[DEPRECATION WARNING]: ANSIBLE_COLLECTIONS_PATHS option, does not fit var naming standard, use the singular form ANSIBLE_COLLECTIONS_PATH instead. This feature will be removed from ansible-core in version 2.19. Deprecation warnings can be disabled by setting deprecation_warnings=False in ansible.cfg.
8454 1726882401.45622: starting run
ansible-playbook [core 2.17.4]
  config file = None
  configured module search path = ['/root/.ansible/plugins/modules', '/usr/share/ansible/plugins/modules']
  ansible python module location = /usr/local/lib/python3.12/site-packages/ansible
  ansible collection location = /tmp/collections-4FB
  executable location = /usr/local/bin/ansible-playbook
  python version = 3.12.5 (main, Aug 7 2024, 00:00:00) [GCC 13.3.1 20240522 (Red Hat 13.3.1-1)] (/usr/bin/python3.12)
  jinja version = 3.1.4
  libyaml = True
No config file found; using defaults
8454 1726882401.46192: Added group all to inventory
8454 1726882401.46195: Added group ungrouped to inventory
8454 1726882401.46200: Group all now contains ungrouped
8454 1726882401.46203: Examining possible inventory source: /tmp/network-lQx/inventory.yml
8454 1726882401.67366: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/cache
8454 1726882401.67442: Loading CacheModule 'memory' from /usr/local/lib/python3.12/site-packages/ansible/plugins/cache/memory.py
8454 1726882401.67469: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory
8454 1726882401.67547: Loading InventoryModule 'host_list' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/host_list.py
8454 1726882401.67643: Loaded config def from plugin (inventory/script)
8454 1726882401.67645: Loading InventoryModule 'script' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/script.py
8454 1726882401.67696: Loading InventoryModule 'auto' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/auto.py
8454 1726882401.67808: Loaded config def from plugin (inventory/yaml)
8454 1726882401.67811: Loading InventoryModule 'yaml' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/yaml.py
8454 1726882401.67918: Loading InventoryModule 'ini' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/ini.py
8454 1726882401.68481: Loading InventoryModule 'toml' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/toml.py
8454 1726882401.68485: Attempting to use plugin host_list (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/host_list.py)
8454 1726882401.68489: Attempting to use plugin script (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/script.py)
8454 1726882401.68496: Attempting to use plugin auto (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/auto.py)
8454 1726882401.68501: Loading data from /tmp/network-lQx/inventory.yml
8454 1726882401.68595: /tmp/network-lQx/inventory.yml was not parsable by auto
8454 1726882401.68676: Attempting to use plugin yaml (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/yaml.py)
8454 1726882401.68722: Loading data from /tmp/network-lQx/inventory.yml
8454 1726882401.68827: group all already in inventory
8454 1726882401.68837: set inventory_file for managed_node1
8454 1726882401.68842: set inventory_dir for managed_node1
8454 1726882401.68843: Added host managed_node1 to inventory
8454 1726882401.68846: Added host managed_node1 to group all
8454 1726882401.68848: set ansible_host for managed_node1
8454 1726882401.68849: set ansible_ssh_extra_args for managed_node1
8454 1726882401.68854: set inventory_file for managed_node2
8454 1726882401.68857: set inventory_dir for managed_node2
8454 1726882401.68858: Added host managed_node2 to inventory
8454 1726882401.68860: Added host managed_node2 to group all
8454 1726882401.68861: set ansible_host for managed_node2
8454 1726882401.68862: set ansible_ssh_extra_args for managed_node2
8454 1726882401.68866: set inventory_file for managed_node3
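The inventory events above show Ansible's source-parsing fallback: each enabled inventory plugin is tried in order (host_list, script, auto, yaml, ...), and the first one that accepts the file wins; here `auto` declines the plain YAML file and `yaml` parses it. A minimal sketch of that first-match loop, assuming hypothetical stand-in parser callables rather than the real plugin API:

```python
# Sketch of the first-match inventory fallback seen in the log above.
# Plugin names mirror the log; the parser callables are hypothetical stand-ins.

def parse_with_fallback(path, plugins):
    """Return (plugin_name, data) from the first plugin that accepts path."""
    for name, parser in plugins:
        try:
            return name, parser(path)
        except ValueError:
            # Mirrors "<path> was not parsable by <plugin>"; try the next one.
            continue
    raise RuntimeError(f"no inventory plugin could parse {path}")

def auto_plugin(path):
    # 'auto' only handles sources that name a plugin; a plain YAML inventory
    # is "not parsable by auto", exactly as the log records.
    raise ValueError(f"{path} was not parsable by auto")

def yaml_plugin(path):
    # Stand-in for the yaml plugin; returns the hosts this log goes on to add.
    return {"all": {"hosts": ["managed_node1", "managed_node2", "managed_node3"]}}

name, data = parse_with_fallback(
    "/tmp/network-lQx/inventory.yml",
    [("auto", auto_plugin), ("yaml", yaml_plugin)],
)
print(name)  # yaml
```

The same ordering explains why `host_list` and `script` are "attempted" first in the log: they simply decline the `.yml` source before `auto` and `yaml` get their turn.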
8454 1726882401.68868: set inventory_dir for managed_node3
8454 1726882401.68869: Added host managed_node3 to inventory
8454 1726882401.68871: Added host managed_node3 to group all
8454 1726882401.68872: set ansible_host for managed_node3
8454 1726882401.68873: set ansible_ssh_extra_args for managed_node3
8454 1726882401.68876: Reconcile groups and hosts in inventory.
8454 1726882401.68881: Group ungrouped now contains managed_node1
8454 1726882401.68884: Group ungrouped now contains managed_node2
8454 1726882401.68886: Group ungrouped now contains managed_node3
8454 1726882401.68979: '/usr/local/lib/python3.12/site-packages/ansible/plugins/vars/__init__' skipped due to reserved name
8454 1726882401.69141: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments
8454 1726882401.69205: Loading ModuleDocFragment 'vars_plugin_staging' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/vars_plugin_staging.py
8454 1726882401.69249: Loaded config def from plugin (vars/host_group_vars)
8454 1726882401.69252: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.12/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=False, class_only=True)
8454 1726882401.69261: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/vars
8454 1726882401.69271: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.12/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False)
8454 1726882401.69326: Loading CacheModule 'memory' from /usr/local/lib/python3.12/site-packages/ansible/plugins/cache/memory.py (found_in_cache=True, class_only=False)
8454 1726882401.69704: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
8454 1726882401.69818: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py
8454 1726882401.69876: Loaded config def from plugin (connection/local)
8454 1726882401.69880: Loading Connection 'local' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/local.py (found_in_cache=False, class_only=True)
8454 1726882401.70770: Loaded config def from plugin (connection/paramiko_ssh)
8454 1726882401.70774: Loading Connection 'paramiko_ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/paramiko_ssh.py (found_in_cache=False, class_only=True)
8454 1726882401.71919: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py (found_in_cache=True, class_only=False)
8454 1726882401.71975: Loaded config def from plugin (connection/psrp)
8454 1726882401.71978: Loading Connection 'psrp' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/psrp.py (found_in_cache=False, class_only=True)
8454 1726882402: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py (found_in_cache=True, class_only=False)
8454 1726882401.73056: Loaded config def from plugin (connection/ssh)
8454 1726882401.73059: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=False, class_only=True)
8454 1726882401.75473: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py (found_in_cache=True, class_only=False)
8454 1726882401.75528: Loaded config def from plugin (connection/winrm)
8454 1726882401.75532: Loading Connection 'winrm' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/winrm.py (found_in_cache=False, class_only=True)
8454 1726882401.75571: '/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/__init__' skipped due to reserved name
8454 1726882401.75648: Loading ModuleDocFragment 'shell_windows' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/shell_windows.py
8454 1726882401.75751: Loaded config def from plugin (shell/cmd)
8454 1726882401.75753: Loading ShellModule 'cmd' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/cmd.py (found_in_cache=False, class_only=True)
8454 1726882401.75785: Loading ModuleDocFragment 'shell_windows' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/shell_windows.py (found_in_cache=True, class_only=False)
8454 1726882401.75885: Loaded config def from plugin (shell/powershell)
8454 1726882401.75887: Loading ShellModule 'powershell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/powershell.py (found_in_cache=False, class_only=True)
8454 1726882401.75956: Loading ModuleDocFragment 'shell_common' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/shell_common.py
8454 1726882401.76210: Loaded config def from plugin (shell/sh)
8454 1726882401.76213: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=False, class_only=True)
8454 1726882401.76258: '/usr/local/lib/python3.12/site-packages/ansible/plugins/become/__init__' skipped due to reserved name
8454 1726882401.76431: Loaded config def from plugin (become/runas)
8454 1726882401.76435: Loading BecomeModule 'runas' from /usr/local/lib/python3.12/site-packages/ansible/plugins/become/runas.py (found_in_cache=False, class_only=True)
8454 1726882401.76698: Loaded config def from plugin (become/su)
8454 1726882401.76701: Loading BecomeModule 'su' from /usr/local/lib/python3.12/site-packages/ansible/plugins/become/su.py (found_in_cache=False, class_only=True)
8454 1726882401.76923: Loaded config def from plugin (become/sudo)
8454 1726882401.76926: Loading BecomeModule 'sudo' from /usr/local/lib/python3.12/site-packages/ansible/plugins/become/sudo.py (found_in_cache=False, class_only=True)
running playbook inside collection fedora.linux_system_roles
8454 1726882401.76970: Loading data from /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/tests_bond_nm.yml
8454 1726882401.77405: in VariableManager get_vars()
8454 1726882401.77431: done with get_vars()
8454 1726882401.77591: trying /usr/local/lib/python3.12/site-packages/ansible/modules
8454 1726882401.81026: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/action
8454 1726882401.81168: in VariableManager get_vars()
8454 1726882401.81173: done with get_vars()
8454 1726882401.81176: variable 'playbook_dir' from source: magic vars
8454 1726882401.81178: variable 'ansible_playbook_python' from source: magic vars
8454 1726882401.81179: variable 'ansible_config_file' from source: magic vars
8454 1726882401.81179: variable 'groups' from source: magic vars
8454 1726882401.81180: variable 'omit' from source: magic vars
8454 1726882401.81181: variable 'ansible_version' from source: magic vars
8454 1726882401.81182: variable 'ansible_check_mode' from source: magic vars
8454 1726882401.81183: variable 'ansible_diff_mode' from source: magic vars
8454 1726882401.81184: variable 'ansible_forks' from source: magic vars
8454 1726882401.81185: variable 'ansible_inventory_sources' from source: magic vars
8454 1726882401.81186: variable 'ansible_skip_tags' from source: magic vars
8454 1726882401.81187: variable 'ansible_limit' from source: magic vars
8454 1726882401.81188: variable 'ansible_run_tags' from source: magic vars
8454 1726882401.81188: variable 'ansible_verbosity' from source: magic vars
8454 1726882401.81229: Loading data from /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond.yml
8454 1726882401.82172: in VariableManager get_vars()
8454 1726882401.82191: done with get_vars()
8454 1726882401.82202: Loading data from /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_test_interfaces_with_dhcp.yml
statically imported: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_test_interfaces_with_dhcp.yml
8454 1726882401.83537: in VariableManager get_vars()
8454 1726882401.83555: done with get_vars()
8454 1726882401.83565: Loading data from /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml
statically imported: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml
8454 1726882401.83700: in VariableManager get_vars()
8454 1726882401.83719: done with get_vars()
8454 1726882401.83899: in VariableManager get_vars()
8454 1726882401.83915: done with get_vars()
8454 1726882401.83926: Loading data from /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml
statically imported: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml
8454 1726882401.84022: in VariableManager get_vars()
8454 1726882401.84042: done with get_vars()
8454 1726882401.84408: in VariableManager get_vars()
8454 1726882401.84424: done with get_vars()
8454 1726882401.84430: variable 'omit' from source: magic vars
8454 1726882401.84455: variable 'omit' from source: magic vars
8454 1726882401.84500: in VariableManager get_vars()
8454 1726882401.84514: done with get_vars()
8454 1726882401.84574: in VariableManager get_vars()
8454 1726882401.84590: done with get_vars()
8454 1726882401.84633: Loading data from /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml
8454 1726882401.84927: Loading data from /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml
8454 1726882401.85108: Loading data from /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml
8454 1726882401.86042: in VariableManager get_vars()
8454 1726882401.86066: done with get_vars()
8454 1726882401.86589: trying /usr/local/lib/python3.12/site-packages/ansible/modules/__pycache__
8454 1726882401.86776: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__
redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf
8454 1726882401.88854: in VariableManager get_vars()
8454 1726882401.88876: done with get_vars()
8454 1726882401.88885: Loading data from /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml
statically imported: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml
8454 1726882401.89076: in VariableManager get_vars()
8454 1726882401.89099: done with get_vars()
8454 1726882401.89253: in VariableManager get_vars()
8454 1726882401.89276: done with get_vars()
8454 1726882401.89650: in VariableManager get_vars()
8454 1726882401.89672: done with get_vars()
8454 1726882401.89678: variable 'omit' from source: magic vars
8454 1726882401.89714: variable 'omit' from source: magic vars
8454 1726882401.89772: in VariableManager get_vars()
8454 1726882401.89790: done with get_vars()
8454 1726882401.89815: in VariableManager get_vars()
8454 1726882401.89836: done with get_vars()
8454 1726882401.89872: Loading data from /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml
8454 1726882401.90026: Loading data from /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml
8454 1726882401.91571: Loading data from /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml
8454 1726882401.92123: in VariableManager get_vars()
8454 1726882401.92152: done with get_vars()
redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf
8454 1726882401.94776: in VariableManager get_vars()
8454 1726882401.94801: done with get_vars()
8454 1726882401.94811: Loading data from /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/remove_test_interfaces_with_dhcp.yml
statically imported: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/remove_test_interfaces_with_dhcp.yml
8454 1726882401.95450: in VariableManager get_vars()
8454 1726882401.95474: done with get_vars()
8454 1726882401.95548: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/callback
8454 1726882401.95568: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__
redirecting (type: callback) ansible.builtin.debug to ansible.posix.debug
redirecting (type: callback) ansible.builtin.debug to ansible.posix.debug
8454 1726882401.95830: Loading ModuleDocFragment 'default_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/default_callback.py
8454 1726882401.96066: Loaded config def from plugin (callback/ansible_collections.ansible.posix.plugins.callback.debug)
8454 1726882401.96069: Loading CallbackModule 'ansible_collections.ansible.posix.plugins.callback.debug' from /tmp/collections-4FB/ansible_collections/ansible/posix/plugins/callback/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__)
8454 1726882401.96109: '/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__init__' skipped due to reserved name
8454 1726882401.96143: Loading ModuleDocFragment 'default_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/default_callback.py (found_in_cache=True, class_only=False)
8454 1726882401.96383: Loading ModuleDocFragment 'result_format_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/result_format_callback.py
8454 1726882401.96470: Loaded config def from plugin (callback/default)
8454 1726882401.96473: Loading CallbackModule 'default' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/default.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__) (found_in_cache=False, class_only=True)
8454 1726882401.97897: Loaded config def from plugin (callback/junit)
8454 1726882401.97900: Loading CallbackModule 'junit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/junit.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__) (found_in_cache=False, class_only=True)
8454 1726882401.97956: Loading ModuleDocFragment 'result_format_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/result_format_callback.py (found_in_cache=True, class_only=False)
8454 1726882401.98048: Loaded config def from plugin (callback/minimal)
8454 1726882401.98051: Loading CallbackModule 'minimal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/minimal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__) (found_in_cache=False, class_only=True)
8454 1726882401.98100: Loading CallbackModule 'oneline' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/oneline.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__) (found_in_cache=False, class_only=True)
8454 1726882401.98181: Loaded config def from plugin (callback/tree)
8454 1726882401.98184: Loading CallbackModule 'tree' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/tree.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__) (found_in_cache=False, class_only=True)
redirecting (type: callback) ansible.builtin.profile_tasks to ansible.posix.profile_tasks
8454 1726882401.98351: Loaded config def from plugin (callback/ansible_collections.ansible.posix.plugins.callback.profile_tasks)
8454 1726882401.98354: Loading CallbackModule 'ansible_collections.ansible.posix.plugins.callback.profile_tasks' from /tmp/collections-4FB/ansible_collections/ansible/posix/plugins/callback/profile_tasks.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__) (found_in_cache=False, class_only=True)
Skipping callback 'default', as we already have a stdout callback.
Skipping callback 'minimal', as we already have a stdout callback.
Skipping callback 'oneline', as we already have a stdout callback.
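The "Skipping callback" lines above reflect a rule of the callback loader: only one plugin may own stdout, so once the configured stdout callback (the redirected `ansible.posix.debug` here) is in place, later stdout-capable callbacks such as `default`, `minimal`, and `oneline` are skipped. A simplified sketch of that selection, with the candidate list taken from the log and the logic reduced to first-stdout-wins (not the loader's actual code):

```python
# Sketch of stdout-callback deduplication: the first callback claiming stdout
# is kept; subsequent stdout-capable ones are skipped, as the log shows.

def select_callbacks(candidates):
    """candidates: [(name, is_stdout)] in load order; first stdout wins."""
    loaded, skipped, have_stdout = [], [], False
    for name, is_stdout in candidates:
        if is_stdout and have_stdout:
            skipped.append(name)  # "Skipping callback '<name>', as we
            continue              #  already have a stdout callback."
        if is_stdout:
            have_stdout = True
        loaded.append(name)
    return loaded, skipped

loaded, skipped = select_callbacks([
    ("ansible.posix.debug", True),   # configured stdout callback, loads first
    ("default", True),
    ("junit", False),
    ("minimal", True),
    ("oneline", True),
    ("tree", False),
    ("ansible.posix.profile_tasks", False),
])
print(skipped)  # ['default', 'minimal', 'oneline'], matching the log
```

Non-stdout callbacks like `junit`, `tree`, and `profile_tasks` are unaffected and can all run alongside the single stdout callback.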
PLAYBOOK: tests_bond_nm.yml ****************************************************
2 plays in /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/tests_bond_nm.yml
8454 1726882401.98383: in VariableManager get_vars()
8454 1726882401.98396: done with get_vars()
8454 1726882401.98402: in VariableManager get_vars()
8454 1726882401.98413: done with get_vars()
8454 1726882401.98417: variable 'omit' from source: magic vars
8454 1726882401.98462: in VariableManager get_vars()
8454 1726882401.98477: done with get_vars()
8454 1726882401.98502: variable 'omit' from source: magic vars

PLAY [Run playbook 'playbooks/tests_bond.yml' with nm as provider] *************
8454 1726882401.99126: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy
8454 1726882401.99213: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py
8454 1726882401.99246: getting the remaining hosts for this loop
8454 1726882401.99248: done getting the remaining hosts for this loop
8454 1726882401.99251: getting the next task for host managed_node3
8454 1726882401.99255: done getting next task for host managed_node3
8454 1726882401.99257: ^ task is: TASK: Gathering Facts
8454 1726882401.99259: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
8454 1726882401.99262: getting variables
8454 1726882401.99263: in VariableManager get_vars()
8454 1726882401.99275: Calling all_inventory to load vars for managed_node3
8454 1726882401.99279: Calling groups_inventory to load vars for managed_node3
8454 1726882401.99283: Calling all_plugins_inventory to load vars for managed_node3
8454 1726882401.99298: Calling all_plugins_play to load vars for managed_node3
8454 1726882401.99312: Calling groups_plugins_inventory to load vars for managed_node3
8454 1726882401.99317: Calling groups_plugins_play to load vars for managed_node3
8454 1726882401.99378: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
8454 1726882401.99443: done with get_vars()
8454 1726882401.99451: done getting variables
8454 1726882401.99520: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True)

TASK [Gathering Facts] *********************************************************
task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/tests_bond_nm.yml:6
Friday 20 September 2024 21:33:21 -0400 (0:00:00.013) 0:00:00.013 ******
8454 1726882401.99547: entering _queue_task() for managed_node3/gather_facts
8454 1726882401.99549: Creating lock for gather_facts
8454 1726882401.99936: worker is 1 (out of 1 available)
8454 1726882401.99949: exiting _queue_task() for managed_node3/gather_facts
8454 1726882401.99965: done queuing things up, now waiting for results queue to drain
8454 1726882401.99967: waiting for pending results...
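The task banner above ends with a timing line from the redirected `ansible.posix.profile_tasks` callback: the parenthesized value is the previous task's duration and the trailing value is the cumulative elapsed time, both in an `H:MM:SS.mmm` style. A small sketch of that rendering; the helper below is an illustrative assumption, not the callback's actual code:

```python
# Sketch of the "(elapsed) cumulative" timing values in the task banner,
# formatted in the log's H:MM:SS.mmm style.

def fmt(seconds):
    whole = int(seconds)
    h, rem = divmod(whole, 3600)
    m, s = divmod(rem, 60)
    ms = int(round((seconds - whole) * 1000))
    return f"{h}:{m:02d}:{s:02d}.{ms:03d}"

durations = [0.013]  # per-task durations recorded so far (first task here)
line = f"({fmt(durations[-1])}) {fmt(sum(durations))}"
print(line)  # (0:00:00.013) 0:00:00.013
```

For the first task the two values coincide, exactly as in the banner; later banners show the cumulative value growing while the parenthesized value resets per task.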
8454 1726882402.00257: running TaskExecutor() for managed_node3/TASK: Gathering Facts
8454 1726882402.00318: in run() - task 0affe814-3a2d-f59f-16b9-0000000000cc
8454 1726882402.00344: variable 'ansible_search_path' from source: unknown
8454 1726882402.00398: calling self._execute()
8454 1726882402.00473: variable 'ansible_host' from source: host vars for 'managed_node3'
8454 1726882402.00498: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
8454 1726882402.00515: variable 'omit' from source: magic vars
8454 1726882402.00648: variable 'omit' from source: magic vars
8454 1726882402.00716: variable 'omit' from source: magic vars
8454 1726882402.00749: variable 'omit' from source: magic vars
8454 1726882402.00807: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection
8454 1726882402.00934: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
8454 1726882402.00939: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell
8454 1726882402.00943: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
8454 1726882402.00946: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
8454 1726882402.00978: variable 'inventory_hostname' from source: host vars for 'managed_node3'
8454 1726882402.00992: variable 'ansible_host' from source: host vars for 'managed_node3'
8454 1726882402.01002: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
8454 1726882402.01143: Set connection var ansible_connection to ssh
8454 1726882402.01168: Set connection var ansible_shell_executable to /bin/sh
8454 1726882402.01185: Set connection var ansible_timeout to 10
8454 1726882402.01194: Set connection var ansible_shell_type to sh
8454 1726882402.01209: Set connection var ansible_pipelining to False
8454 1726882402.01221: Set connection var ansible_module_compression to ZIP_DEFLATED
8454 1726882402.01253: variable 'ansible_shell_executable' from source: unknown
8454 1726882402.01341: variable 'ansible_connection' from source: unknown
8454 1726882402.01344: variable 'ansible_module_compression' from source: unknown
8454 1726882402.01347: variable 'ansible_shell_type' from source: unknown
8454 1726882402.01349: variable 'ansible_shell_executable' from source: unknown
8454 1726882402.01352: variable 'ansible_host' from source: host vars for 'managed_node3'
8454 1726882402.01354: variable 'ansible_pipelining' from source: unknown
8454 1726882402.01357: variable 'ansible_timeout' from source: unknown
8454 1726882402.01360: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
8454 1726882402.01551: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False)
8454 1726882402.01568: variable 'omit' from source: magic vars
8454 1726882402.01579: starting attempt loop
8454 1726882402.01599: running the handler
8454 1726882402.01620: variable 'ansible_facts' from source: unknown
8454 1726882402.01646: _low_level_execute_command(): starting
8454 1726882402.01662: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0'
8454 1726882402.02499: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<<
8454 1726882402.02542: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<<
8454 1726882402.02555: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf
debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238
debug2: match not found
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
8454 1726882402.02587: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<<
8454 1726882402.02652: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.41.238 is address
debug1: re-parsing configuration
debug1: Reading configuration data /root/.ssh/config
debug1: Reading configuration data /etc/ssh/ssh_config
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf
debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238
debug2: match found
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
8454 1726882402.02707: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<<
8454 1726882402.02732: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<<
8454 1726882402.02773: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
8454 1726882402.02900: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
8454 1726882402.04938: stdout chunk (state=3): >>>/root <<<
8454 1726882402.04999: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
8454 1726882402.05002: stderr chunk (state=3): >>><<<
8454 1726882402.05100: stdout chunk (state=3): >>><<<
8454 1726882402.05104: _low_level_execute_command() done: rc=0, stdout=/root
, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023
debug1: Reading configuration data /root/.ssh/config
debug1: Reading configuration data /etc/ssh/ssh_config
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf
debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238
debug2: match not found
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config
debug1: configuration requests final Match pass
debug2: resolve_canonicalize: hostname 10.31.41.238 is address
debug1: re-parsing configuration
debug1: Reading configuration data /root/.ssh/config
debug1: Reading configuration data /etc/ssh/ssh_config
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf
debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238
debug2: match found
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config
debug1: auto-mux: Trying existing master
debug2: fd 3 setting O_NONBLOCK
debug2: mux_client_hello_exchange: master version 4
debug1: mux_client_request_session: master session id: 2
debug2: Received exit status from master 0
8454 1726882402.05131: _low_level_execute_command(): starting
8454 1726882402.05146: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882402.0511794-8467-280613591612713 `" && echo ansible-tmp-1726882402.0511794-8467-280613591612713="` echo /root/.ansible/tmp/ansible-tmp-1726882402.0511794-8467-280613591612713 `" ) && sleep 0'
8454 1726882402.05878: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<<
8454 1726882402.05940: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config
debug1: Reading configuration data /etc/ssh/ssh_config
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf
debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238
debug2: match not found
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config
debug1: configuration requests final Match pass
debug2: resolve_canonicalize: hostname 10.31.41.238 is address
debug1: re-parsing configuration
debug1: Reading configuration data /root/.ssh/config
debug1: Reading configuration data /etc/ssh/ssh_config
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf
debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238
debug2: match found
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
8454 1726882402.06031: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<<
8454 1726882402.06060: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<<
8454 1726882402.06092: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
8454 1726882402.06241: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
8454 1726882402.08342: stdout chunk (state=3): >>>ansible-tmp-1726882402.0511794-8467-280613591612713=/root/.ansible/tmp/ansible-tmp-1726882402.0511794-8467-280613591612713 <<<
8454 1726882402.08642: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
8454 1726882402.08648: stdout chunk (state=3): >>><<<
8454 1726882402.08650: stderr chunk (state=3): >>><<<
8454 1726882402.08653: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882402.0511794-8467-280613591612713=/root/.ansible/tmp/ansible-tmp-1726882402.0511794-8467-280613591612713
, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023
debug1: Reading configuration data /root/.ssh/config
debug1: Reading configuration data /etc/ssh/ssh_config
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf
debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238
debug2: match not found
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config
debug1: configuration requests final Match pass
debug2: resolve_canonicalize: hostname 10.31.41.238 is address
debug1: re-parsing configuration
debug1: Reading configuration data /root/.ssh/config
debug1: Reading configuration data /etc/ssh/ssh_config
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf
debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238
debug2: match found
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config
debug1: auto-mux: Trying existing master
debug2: fd 3 setting O_NONBLOCK
debug2: mux_client_hello_exchange: master version 4
debug1: mux_client_request_session: master session id: 2
debug2: Received exit status from master 0
8454 1726882402.08655: variable 'ansible_module_compression' from source: unknown
8454 1726882402.08861: ANSIBALLZ: Using generic lock for ansible.legacy.setup
8454 1726882402.08865: ANSIBALLZ: Acquiring lock
8454 1726882402.08867: ANSIBALLZ: Lock acquired: 140055527345136
8454 1726882402.08969: ANSIBALLZ: Creating module
8454 1726882402.62177: ANSIBALLZ: Writing module into payload
8454 1726882402.62578: ANSIBALLZ: Writing module
8454 1726882402.62582: ANSIBALLZ: Renaming module
8454 1726882402.62585: ANSIBALLZ: Done creating module
8454 1726882402.62588: variable 'ansible_facts' from source: unknown
8454 1726882402.62590: variable 'inventory_hostname' from source: host vars for 'managed_node3'
8454 1726882402.62593: _low_level_execute_command(): starting
8454 1726882402.62596: _low_level_execute_command(): executing: /bin/sh -c 'echo PLATFORM; uname; echo FOUND; command -v '"'"'python3.12'"'"'; command -v '"'"'python3.11'"'"'; command -v '"'"'python3.10'"'"'; command -v '"'"'python3.9'"'"'; command -v '"'"'python3.8'"'"'; command -v '"'"'python3.7'"'"'; command -v '"'"'/usr/bin/python3'"'"'; command -v '"'"'python3'"'"'; echo ENDFOUND && sleep 0'
8454 1726882402.63246: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<<
8454 1726882402.63267: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<<
8454 1726882402.63304: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<<
8454 1726882402.63321: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
8454 1726882402.63336: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.41.238 originally
10.31.41.238 <<< 8454 1726882402.63349: stderr chunk (state=3): >>>debug2: match not found <<< 8454 1726882402.63360: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8454 1726882402.63377: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 8454 1726882402.63388: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.41.238 is address <<< 8454 1726882402.63468: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8454 1726882402.63560: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 8454 1726882402.63564: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 8454 1726882402.63567: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8454 1726882402.63773: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8454 1726882402.65781: stdout chunk (state=3): >>>PLATFORM Linux FOUND /usr/bin/python3.12 /usr/bin/python3 /usr/bin/python3 ENDFOUND <<< 8454 1726882402.65953: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8454 1726882402.65971: stdout chunk (state=3): >>><<< 8454 1726882402.66027: stderr chunk (state=3): >>><<< 8454 1726882402.66147: _low_level_execute_command() done: rc=0, stdout=PLATFORM Linux FOUND /usr/bin/python3.12 /usr/bin/python3 /usr/bin/python3 ENDFOUND , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 8454 1726882402.66154 [managed_node3]: found interpreters: ['/usr/bin/python3.12', '/usr/bin/python3', '/usr/bin/python3'] 8454 1726882402.66314: _low_level_execute_command(): starting 8454 1726882402.66318: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 && sleep 0' 8454 1726882402.67060: Sending initial data 8454 1726882402.67063: Sent initial data (1181 bytes) 8454 1726882402.67895: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 8454 1726882402.67899: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found <<< 8454 1726882402.67902: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 8454 1726882402.67905: stderr chunk 
(state=3): >>>debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found <<< 8454 1726882402.67912: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8454 1726882402.68062: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 8454 1726882402.68120: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8454 1726882402.68324: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8454 1726882402.73114: stdout chunk (state=3): >>>{"platform_dist_result": [], "osrelease_content": "NAME=\"Fedora Linux\"\nVERSION=\"39 (Thirty Nine)\"\nID=fedora\nVERSION_ID=39\nVERSION_CODENAME=\"\"\nPLATFORM_ID=\"platform:f39\"\nPRETTY_NAME=\"Fedora Linux 39 (Thirty Nine)\"\nANSI_COLOR=\"0;38;2;60;110;180\"\nLOGO=fedora-logo-icon\nCPE_NAME=\"cpe:/o:fedoraproject:fedora:39\"\nDEFAULT_HOSTNAME=\"fedora\"\nHOME_URL=\"https://fedoraproject.org/\"\nDOCUMENTATION_URL=\"https://docs.fedoraproject.org/en-US/fedora/f39/system-administrators-guide/\"\nSUPPORT_URL=\"https://ask.fedoraproject.org/\"\nBUG_REPORT_URL=\"https://bugzilla.redhat.com/\"\nREDHAT_BUGZILLA_PRODUCT=\"Fedora\"\nREDHAT_BUGZILLA_PRODUCT_VERSION=39\nREDHAT_SUPPORT_PRODUCT=\"Fedora\"\nREDHAT_SUPPORT_PRODUCT_VERSION=39\nSUPPORT_END=2024-11-12\n"} <<< 8454 1726882402.73532: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8454 1726882402.73547: stdout chunk (state=3): >>><<< 8454 1726882402.73560: stderr chunk (state=3): >>><<< 8454 1726882402.73942: _low_level_execute_command() done: rc=0, stdout={"platform_dist_result": 
[], "osrelease_content": "NAME=\"Fedora Linux\"\nVERSION=\"39 (Thirty Nine)\"\nID=fedora\nVERSION_ID=39\nVERSION_CODENAME=\"\"\nPLATFORM_ID=\"platform:f39\"\nPRETTY_NAME=\"Fedora Linux 39 (Thirty Nine)\"\nANSI_COLOR=\"0;38;2;60;110;180\"\nLOGO=fedora-logo-icon\nCPE_NAME=\"cpe:/o:fedoraproject:fedora:39\"\nDEFAULT_HOSTNAME=\"fedora\"\nHOME_URL=\"https://fedoraproject.org/\"\nDOCUMENTATION_URL=\"https://docs.fedoraproject.org/en-US/fedora/f39/system-administrators-guide/\"\nSUPPORT_URL=\"https://ask.fedoraproject.org/\"\nBUG_REPORT_URL=\"https://bugzilla.redhat.com/\"\nREDHAT_BUGZILLA_PRODUCT=\"Fedora\"\nREDHAT_BUGZILLA_PRODUCT_VERSION=39\nREDHAT_SUPPORT_PRODUCT=\"Fedora\"\nREDHAT_SUPPORT_PRODUCT_VERSION=39\nSUPPORT_END=2024-11-12\n"} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 8454 1726882402.73945: variable 'ansible_facts' from source: unknown 8454 1726882402.73953: variable 
'ansible_facts' from source: unknown 8454 1726882402.73955: variable 'ansible_module_compression' from source: unknown 8454 1726882402.73958: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-8454ynh6eu7y/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 8454 1726882402.74340: variable 'ansible_facts' from source: unknown 8454 1726882402.74343: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882402.0511794-8467-280613591612713/AnsiballZ_setup.py 8454 1726882402.74667: Sending initial data 8454 1726882402.74677: Sent initial data (152 bytes) 8454 1726882402.75942: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 8454 1726882402.75957: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 8454 1726882402.76052: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8454 1726882402.76090: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 8454 1726882402.76115: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 8454 1726882402.76133: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8454 1726882402.76365: stderr chunk (state=3): >>>debug1: 
mux_client_request_session: master session id: 2 <<< 8454 1726882402.78053: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 8454 1726882402.78180: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 8454 1726882402.78310: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-8454ynh6eu7y/tmpjqjbns40 /root/.ansible/tmp/ansible-tmp-1726882402.0511794-8467-280613591612713/AnsiballZ_setup.py <<< 8454 1726882402.78340: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882402.0511794-8467-280613591612713/AnsiballZ_setup.py" <<< 8454 1726882402.78477: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-8454ynh6eu7y/tmpjqjbns40" to remote "/root/.ansible/tmp/ansible-tmp-1726882402.0511794-8467-280613591612713/AnsiballZ_setup.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882402.0511794-8467-280613591612713/AnsiballZ_setup.py" <<< 8454 1726882402.81545: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8454 1726882402.81607: stderr chunk (state=3): >>><<< 8454 1726882402.81620: stdout chunk (state=3): >>><<< 8454 1726882402.81670: done 
transferring module to remote 8454 1726882402.81699: _low_level_execute_command(): starting 8454 1726882402.81710: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882402.0511794-8467-280613591612713/ /root/.ansible/tmp/ansible-tmp-1726882402.0511794-8467-280613591612713/AnsiballZ_setup.py && sleep 0' 8454 1726882402.82407: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 8454 1726882402.82444: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 8454 1726882402.82519: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8454 1726882402.82573: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 8454 1726882402.82599: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 8454 1726882402.82623: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8454 1726882402.82767: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8454 1726882402.84771: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8454 1726882402.84799: stdout chunk (state=3): >>><<< 8454 1726882402.84802: 
stderr chunk (state=3): >>><<< 8454 1726882402.84823: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 8454 1726882402.84927: _low_level_execute_command(): starting 8454 1726882402.84931: _low_level_execute_command(): executing: /bin/sh -c 'PYTHONVERBOSE=1 /usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882402.0511794-8467-280613591612713/AnsiballZ_setup.py && sleep 0' 8454 1726882402.85535: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 8454 1726882402.85552: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 8454 1726882402.85567: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 8454 1726882402.85604: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 8454 1726882402.85719: stderr chunk 
(state=3): >>>debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 8454 1726882402.85747: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8454 1726882402.85899: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8454 1726882402.88190: stdout chunk (state=3): >>>import _frozen_importlib # frozen import _imp # builtin <<< 8454 1726882402.88220: stdout chunk (state=3): >>>import '_thread' # import '_warnings' # import '_weakref' # <<< 8454 1726882402.88375: stdout chunk (state=3): >>>import '_io' # import 'marshal' # <<< 8454 1726882402.88382: stdout chunk (state=3): >>> import 'posix' # import '_frozen_importlib_external' # # installing zipimport hook <<< 8454 1726882402.88462: stdout chunk (state=3): >>>import 'time' # import 'zipimport' # # installed zipimport hook <<< 8454 1726882402.88473: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' <<< 8454 1726882402.88490: stdout chunk (state=3): >>>import '_codecs' # <<< 8454 1726882402.88553: stdout chunk (state=3): >>>import 'codecs' # <<< 8454 
1726882402.88672: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3dd920c530> <<< 8454 1726882402.88677: stdout chunk (state=3): >>>import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3dd91dbb30> <<< 8454 1726882402.88712: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3dd920eab0> import '_signal' # import '_abc' # import 'abc' # <<< 8454 1726882402.88729: stdout chunk (state=3): >>>import 'io' # <<< 8454 1726882402.88756: stdout chunk (state=3): >>>import '_stat' # import 'stat' # <<< 8454 1726882402.88852: stdout chunk (state=3): >>>import '_collections_abc' # <<< 8454 1726882402.88918: stdout chunk (state=3): >>>import 'genericpath' # import 'posixpath' # <<< 8454 1726882402.88953: stdout chunk (state=3): >>>import 'os' # import '_sitebuiltins' # <<< 8454 1726882402.88976: stdout chunk (state=3): >>>Processing user site-packages Processing global site-packages <<< 8454 1726882402.89005: stdout chunk (state=3): >>>Adding directory: '/usr/lib64/python3.12/site-packages' Adding directory: '/usr/lib/python3.12/site-packages' <<< 8454 1726882402.89009: stdout chunk (state=3): >>>Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' <<< 8454 1726882402.89292: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py # code object from 
'/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3dd9001160> # /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py # code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3dd9001fd0> import 'site' # Python 3.12.5 (main, Aug 7 2024, 00:00:00) [GCC 13.3.1 20240522 (Red Hat 13.3.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. <<< 8454 1726882402.89916: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py <<< 8454 1726882402.89927: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' <<< 8454 1726882402.89965: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py <<< 8454 1726882402.89968: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' <<< 8454 1726882402.90004: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py <<< 8454 1726882402.90062: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' <<< 8454 1726882402.90090: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py <<< 8454 1726882402.90119: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' <<< 8454 1726882402.90141: stdout chunk (state=3): >>>import 'types' # <_frozen_importlib_external.SourceFileLoader object at 
0x7f3dd903fdd0> <<< 8454 1726882402.90162: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py <<< 8454 1726882402.90192: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' <<< 8454 1726882402.90226: stdout chunk (state=3): >>>import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3dd903ffe0> <<< 8454 1726882402.90292: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py # code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' <<< 8454 1726882402.90321: stdout chunk (state=3): >>># /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py <<< 8454 1726882402.90403: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' <<< 8454 1726882402.90430: stdout chunk (state=3): >>>import 'itertools' # <<< 8454 1726882402.90472: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3dd9077800> <<< 8454 1726882402.90508: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py # code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3dd9077e90> <<< 8454 1726882402.90600: stdout chunk (state=3): >>>import '_collections' # import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3dd9057aa0> <<< 8454 1726882402.90618: stdout chunk (state=3): >>>import 
'_functools' # <<< 8454 1726882402.90669: stdout chunk (state=3): >>>import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3dd9055190> <<< 8454 1726882402.90819: stdout chunk (state=3): >>>import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3dd903cf80> <<< 8454 1726882402.90873: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py <<< 8454 1726882402.90897: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' import '_sre' # <<< 8454 1726882402.90921: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py <<< 8454 1726882402.90959: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' <<< 8454 1726882402.91008: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py <<< 8454 1726882402.91011: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' <<< 8454 1726882402.91058: stdout chunk (state=3): >>>import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3dd909b710> import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3dd909a330> <<< 8454 1726882402.91102: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py <<< 8454 1726882402.91116: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3dd9056060> <<< 8454 1726882402.91181: stdout chunk (state=3): >>>import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 
0x7f3dd9098a40> <<< 8454 1726882402.91211: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py # code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3dd90cc6e0> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3dd903c200> <<< 8454 1726882402.91243: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' <<< 8454 1726882402.91327: stdout chunk (state=3): >>># extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3dd90ccb90> <<< 8454 1726882402.91393: stdout chunk (state=3): >>>import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3dd90cca40> # extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3dd90cce00> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3dd903ad20> <<< 8454 1726882402.91397: stdout chunk (state=3): >>># /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' <<< 8454 1726882402.91445: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches 
/usr/lib64/python3.12/warnings.py <<< 8454 1726882402.91488: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3dd90cd4c0> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3dd90cd190> <<< 8454 1726882402.91502: stdout chunk (state=3): >>>import 'importlib.machinery' # <<< 8454 1726882402.91533: stdout chunk (state=3): >>># /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' <<< 8454 1726882402.91580: stdout chunk (state=3): >>>import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3dd90ce3c0> import 'importlib.util' # <<< 8454 1726882402.91619: stdout chunk (state=3): >>>import 'runpy' # # /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py <<< 8454 1726882402.91668: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' <<< 8454 1726882402.91696: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py <<< 8454 1726882402.91712: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' <<< 8454 1726882402.91724: stdout chunk (state=3): >>>import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3dd90e85c0> import 'errno' # <<< 8454 1726882402.91761: stdout chunk (state=3): >>># extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' <<< 8454 1726882402.91802: stdout chunk (state=3): >>># extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7f3dd90e9d00> # /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py <<< 8454 1726882402.91816: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' <<< 8454 1726882402.91838: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py <<< 8454 1726882402.91870: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3dd90eabd0> <<< 8454 1726882402.91907: stdout chunk (state=3): >>># extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' <<< 8454 1726882402.91930: stdout chunk (state=3): >>># extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3dd90eb230> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3dd90ea120> <<< 8454 1726882402.91949: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py <<< 8454 1726882402.91971: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' <<< 8454 1726882402.92026: stdout chunk (state=3): >>># extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' <<< 8454 1726882402.92043: stdout chunk (state=3): >>>import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3dd90ebc80> <<< 8454 1726882402.92056: stdout chunk (state=3): >>>import 'lzma' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f3dd90eb3b0> <<< 8454 1726882402.92141: stdout chunk (state=3): >>>import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3dd90ce3f0> # /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py <<< 8454 1726882402.92175: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' <<< 8454 1726882402.92236: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py # code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' <<< 8454 1726882402.92278: stdout chunk (state=3): >>># extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' <<< 8454 1726882402.92289: stdout chunk (state=3): >>># extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3dd8ddbb30> <<< 8454 1726882402.92317: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' <<< 8454 1726882402.92361: stdout chunk (state=3): >>># extension module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3dd8e04620> <<< 8454 1726882402.92368: stdout chunk (state=3): >>>import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3dd8e04380> <<< 8454 1726882402.92436: stdout chunk (state=3): >>># extension module '_random' loaded from 
'/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3dd8e04590> # extension module '_sha2' loaded from '/usr/lib64/python3.12/lib-dynload/_sha2.cpython-312-x86_64-linux-gnu.so' # extension module '_sha2' executed from '/usr/lib64/python3.12/lib-dynload/_sha2.cpython-312-x86_64-linux-gnu.so' import '_sha2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3dd8e047d0> <<< 8454 1726882402.92456: stdout chunk (state=3): >>>import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3dd8dd9cd0> <<< 8454 1726882402.92492: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py <<< 8454 1726882402.92624: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' <<< 8454 1726882402.92663: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py # code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' <<< 8454 1726882402.92676: stdout chunk (state=3): >>>import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3dd8e05e50> <<< 8454 1726882402.92705: stdout chunk (state=3): >>>import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3dd8e04b00> <<< 8454 1726882402.92726: stdout chunk (state=3): >>>import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3dd90ceae0> <<< 8454 1726882402.92757: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py <<< 8454 1726882402.92820: stdout chunk (state=3): >>># code object from 
'/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' <<< 8454 1726882402.92836: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py <<< 8454 1726882402.92887: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' <<< 8454 1726882402.92915: stdout chunk (state=3): >>>import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3dd8e361e0> <<< 8454 1726882402.92973: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py <<< 8454 1726882402.93007: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' <<< 8454 1726882402.93021: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py # code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' <<< 8454 1726882402.93085: stdout chunk (state=3): >>>import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3dd8e4e390> <<< 8454 1726882402.93110: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py <<< 8454 1726882402.93151: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' <<< 8454 1726882402.93207: stdout chunk (state=3): >>>import 'ntpath' # <<< 8454 1726882402.93258: stdout chunk (state=3): >>># /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3dd8e8b110> <<< 8454 1726882402.93265: stdout chunk (state=3): >>># 
/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py <<< 8454 1726882402.93297: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' <<< 8454 1726882402.93321: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py <<< 8454 1726882402.93451: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' <<< 8454 1726882402.93499: stdout chunk (state=3): >>>import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3dd8eb18b0> <<< 8454 1726882402.93547: stdout chunk (state=3): >>>import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3dd8e8b230> <<< 8454 1726882402.93607: stdout chunk (state=3): >>>import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3dd8e4f020> <<< 8454 1726882402.93670: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3dd8cd81d0> import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3dd8e4d3d0> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3dd8e06d80> <<< 8454 1726882402.94100: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/cp437.pyc' import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7f3dd8cd83b0> <<< 8454 1726882402.94189: stdout chunk (state=3): >>># zipimport: found 103 names in '/tmp/ansible_ansible.legacy.setup_payload_xoigeo1p/ansible_ansible.legacy.setup_payload.zip' <<< 8454 1726882402.94193: stdout chunk (state=3): >>># zipimport: 
zlib available <<< 8454 1726882402.94481: stdout chunk (state=3): >>># zipimport: zlib available <<< 8454 1726882402.94526: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py <<< 8454 1726882402.94543: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' <<< 8454 1726882402.94615: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py <<< 8454 1726882402.94742: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' <<< 8454 1726882402.94804: stdout chunk (state=3): >>># /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' <<< 8454 1726882402.94808: stdout chunk (state=3): >>>import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3dd8d3deb0> <<< 8454 1726882402.94835: stdout chunk (state=3): >>>import '_typing' # <<< 8454 1726882402.95321: stdout chunk (state=3): >>>import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3dd8d14da0> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3dd8cdbec0> # zipimport: zlib available <<< 8454 1726882402.95325: stdout chunk (state=3): >>>import 'ansible' # <<< 8454 1726882402.95328: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 8454 1726882402.95330: stdout chunk (state=3): >>># zipimport: zlib available <<< 8454 1726882402.95389: stdout chunk (state=3): >>>import 'ansible.module_utils' # # zipimport: zlib available <<< 8454 1726882402.97846: stdout chunk (state=3): >>># zipimport: zlib available <<< 8454 1726882403.00003: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches 
/usr/lib64/python3.12/__future__.py # code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3dd8d17d10> # /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' <<< 8454 1726882403.00042: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' <<< 8454 1726882403.00060: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' <<< 8454 1726882403.00305: stdout chunk (state=3): >>># extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3dd8d6d7f0> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3dd8d6d580> import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3dd8d6cec0> # /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3dd8d6d970> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3dd8d3e8d0> import 'atexit' # # extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' 
executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3dd8d6e570> <<< 8454 1726882403.00328: stdout chunk (state=3): >>># extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3dd8d6e7b0> # /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py <<< 8454 1726882403.00393: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' <<< 8454 1726882403.00396: stdout chunk (state=3): >>>import '_locale' # <<< 8454 1726882403.00439: stdout chunk (state=3): >>>import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3dd8d6ecf0> <<< 8454 1726882403.00456: stdout chunk (state=3): >>>import 'pwd' # <<< 8454 1726882403.00490: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py # code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' <<< 8454 1726882403.00714: stdout chunk (state=3): >>>import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3dd8bd4a10> # extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3dd8bd6630> # /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py # code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' import 'selectors' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f3dd8bd6f60> # /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py <<< 8454 1726882403.00718: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3dd8bd7ef0> <<< 8454 1726882403.00731: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py <<< 8454 1726882403.00761: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' <<< 8454 1726882403.00791: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' <<< 8454 1726882403.00832: stdout chunk (state=3): >>>import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3dd8bdabd0> <<< 8454 1726882403.00883: stdout chunk (state=3): >>># extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3dd8bdaf00> <<< 8454 1726882403.00914: stdout chunk (state=3): >>>import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3dd8bd8e90> # /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py <<< 8454 1726882403.00944: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' <<< 8454 1726882403.01136: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches 
/usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' <<< 8454 1726882403.01143: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py <<< 8454 1726882403.01171: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3dd8bdeae0> import '_tokenize' # import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3dd8bdd5b0> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3dd8bdd310> # /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py <<< 8454 1726882403.01187: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' <<< 8454 1726882403.01249: stdout chunk (state=3): >>>import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3dd8bdfb90> <<< 8454 1726882403.01282: stdout chunk (state=3): >>>import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3dd8bd93a0> <<< 8454 1726882403.01529: stdout chunk (state=3): >>># extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3dd8c22cc0> <<< 8454 1726882403.01533: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from 
'/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3dd8c22ea0> <<< 8454 1726882403.01550: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' # extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3dd8c24950> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3dd8c24710> # /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py <<< 8454 1726882403.01591: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' <<< 8454 1726882403.01655: stdout chunk (state=3): >>># extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' <<< 8454 1726882403.01659: stdout chunk (state=3): >>># extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3dd8c26f00> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3dd8c25040> <<< 8454 1726882403.01675: stdout chunk (state=3): >>># /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py 
<<< 8454 1726882403.01735: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' <<< 8454 1726882403.01763: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py # code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' <<< 8454 1726882403.01790: stdout chunk (state=3): >>>import '_string' # <<< 8454 1726882403.01808: stdout chunk (state=3): >>>import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3dd8c32720> <<< 8454 1726882403.02066: stdout chunk (state=3): >>>import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3dd8c270b0> <<< 8454 1726882403.02258: stdout chunk (state=3): >>># extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3dd8c33a10> # extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3dd8c338c0> # extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3dd8c33b60> <<< 8454 1726882403.02304: stdout chunk (state=3): >>>import 'systemd.journal' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f3dd8c23020> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' <<< 8454 1726882403.02324: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py <<< 8454 1726882403.02362: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' <<< 8454 1726882403.02409: stdout chunk (state=3): >>># extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' <<< 8454 1726882403.02573: stdout chunk (state=3): >>># extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3dd8c36b10> <<< 8454 1726882403.02747: stdout chunk (state=3): >>># extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' <<< 8454 1726882403.02773: stdout chunk (state=3): >>># extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3dd8c37dd0> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3dd8c352e0> <<< 8454 1726882403.02800: stdout chunk (state=3): >>># extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3dd8c36180> <<< 8454 
1726882403.02827: stdout chunk (state=3): >>>import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3dd8c34e90> <<< 8454 1726882403.02855: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat' # <<< 8454 1726882403.02955: stdout chunk (state=3): >>># zipimport: zlib available <<< 8454 1726882403.03030: stdout chunk (state=3): >>># zipimport: zlib available <<< 8454 1726882403.03194: stdout chunk (state=3): >>># zipimport: zlib available <<< 8454 1726882403.03218: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.common' # <<< 8454 1726882403.03248: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text' # <<< 8454 1726882403.03313: stdout chunk (state=3): >>># zipimport: zlib available <<< 8454 1726882403.03643: stdout chunk (state=3): >>># zipimport: zlib available <<< 8454 1726882403.03741: stdout chunk (state=3): >>># zipimport: zlib available <<< 8454 1726882403.04759: stdout chunk (state=3): >>># zipimport: zlib available <<< 8454 1726882403.05874: stdout chunk (state=3): >>>import 'ansible.module_utils.six' # import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # # /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' # extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3dd8ac1310> # /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches 
/usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3dd8ac0da0> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3dd8c37f50> import 'ansible.module_utils.compat.selinux' # # zipimport: zlib available <<< 8454 1726882403.06094: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils._text' # # zipimport: zlib available # zipimport: zlib available <<< 8454 1726882403.06361: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py # code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3dd8ac14f0> # zipimport: zlib available <<< 8454 1726882403.06949: stdout chunk (state=3): >>># zipimport: zlib available <<< 8454 1726882403.07670: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 8454 1726882403.07723: stdout chunk (state=3): >>>import 'ansible.module_utils.common.collections' # <<< 8454 1726882403.07742: stdout chunk (state=3): >>># zipimport: zlib available <<< 8454 1726882403.07798: stdout chunk (state=3): >>># zipimport: zlib available <<< 8454 1726882403.07856: stdout chunk (state=3): >>>import 'ansible.module_utils.common.warnings' # <<< 8454 1726882403.07870: stdout chunk (state=3): >>># zipimport: zlib available <<< 8454 1726882403.08001: stdout chunk (state=3): >>># zipimport: zlib available <<< 8454 1726882403.08178: stdout chunk (state=3): >>>import 'ansible.module_utils.errors' # <<< 8454 1726882403.08213: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 8454 1726882403.08240: stdout chunk (state=3): >>>import 'ansible.module_utils.parsing' # # zipimport: zlib available <<< 8454 1726882403.08299: stdout chunk 
(state=3): >>># zipimport: zlib available <<< 8454 1726882403.08368: stdout chunk (state=3): >>>import 'ansible.module_utils.parsing.convert_bool' # # zipimport: zlib available <<< 8454 1726882403.08844: stdout chunk (state=3): >>># zipimport: zlib available <<< 8454 1726882403.09327: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py <<< 8454 1726882403.09426: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' import '_ast' # <<< 8454 1726882403.09550: stdout chunk (state=3): >>>import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3dd8ac3350> <<< 8454 1726882403.09651: stdout chunk (state=3): >>># zipimport: zlib available <<< 8454 1726882403.09713: stdout chunk (state=3): >>># zipimport: zlib available <<< 8454 1726882403.09839: stdout chunk (state=3): >>>import 'ansible.module_utils.common.text.formatters' # import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # import 'ansible.module_utils.common.arg_spec' # <<< 8454 1726882403.09873: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py # code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' <<< 8454 1726882403.10146: stdout chunk (state=3): >>># extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' <<< 8454 1726882403.10150: stdout chunk (state=3): >>># extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3dd8ac99a0> <<< 8454 1726882403.10212: stdout chunk (state=3): >>># extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' <<< 8454 1726882403.10219: stdout chunk 
(state=3): >>># extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3dd8aca300> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3dd8ac2360> <<< 8454 1726882403.10251: stdout chunk (state=3): >>># zipimport: zlib available <<< 8454 1726882403.10311: stdout chunk (state=3): >>># zipimport: zlib available <<< 8454 1726882403.10374: stdout chunk (state=3): >>>import 'ansible.module_utils.common.locale' # <<< 8454 1726882403.10380: stdout chunk (state=3): >>># zipimport: zlib available <<< 8454 1726882403.10448: stdout chunk (state=3): >>># zipimport: zlib available <<< 8454 1726882403.10518: stdout chunk (state=3): >>># zipimport: zlib available <<< 8454 1726882403.10582: stdout chunk (state=3): >>># zipimport: zlib available <<< 8454 1726882403.10678: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py <<< 8454 1726882403.10707: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' <<< 8454 1726882403.10807: stdout chunk (state=3): >>># extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' <<< 8454 1726882403.10810: stdout chunk (state=3): >>># extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3dd8ac9070> <<< 8454 1726882403.10857: stdout chunk (state=3): >>>import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3dd8aca390> <<< 8454 1726882403.10883: stdout chunk (state=3): >>>import 'ansible.module_utils.common.file' # import 
'ansible.module_utils.common.process' # # zipimport: zlib available <<< 8454 1726882403.10956: stdout chunk (state=3): >>># zipimport: zlib available <<< 8454 1726882403.11023: stdout chunk (state=3): >>># zipimport: zlib available <<< 8454 1726882403.11060: stdout chunk (state=3): >>># zipimport: zlib available <<< 8454 1726882403.11125: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' <<< 8454 1726882403.11147: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py <<< 8454 1726882403.11182: stdout chunk (state=3): >>># code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py <<< 8454 1726882403.11229: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' <<< 8454 1726882403.11243: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py <<< 8454 1726882403.11260: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' <<< 8454 1726882403.11311: stdout chunk (state=3): >>>import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3dd8b5e630> <<< 8454 1726882403.11373: stdout chunk (state=3): >>>import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3dd8ad42f0> <<< 8454 1726882403.11756: stdout chunk (state=3): >>>import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3dd8ace420> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 
0x7f3dd8ace270> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # import 'ansible.module_utils.basic' # # zipimport: zlib available # zipimport: zlib available import 'ansible.modules' # # zipimport: zlib available <<< 8454 1726882403.11897: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available <<< 8454 1726882403.11970: stdout chunk (state=3): >>># zipimport: zlib available <<< 8454 1726882403.12077: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 8454 1726882403.12137: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.namespace' # <<< 8454 1726882403.12141: stdout chunk (state=3): >>># zipimport: zlib available <<< 8454 1726882403.12269: stdout chunk (state=3): >>># zipimport: zlib available <<< 8454 1726882403.12390: stdout chunk (state=3): >>># zipimport: zlib available <<< 8454 1726882403.12423: stdout chunk (state=3): >>># zipimport: zlib available <<< 8454 1726882403.12459: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.typing' # <<< 8454 1726882403.12492: stdout chunk (state=3): >>># zipimport: zlib available <<< 8454 1726882403.12837: stdout chunk (state=3): >>># zipimport: zlib available <<< 8454 1726882403.13106: stdout chunk (state=3): >>># zipimport: zlib available <<< 8454 1726882403.13270: stdout chunk (state=3): >>># zipimport: zlib available # /usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/__init__.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc' <<< 8454 1726882403.13319: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc matches 
/usr/lib64/python3.12/multiprocessing/context.py <<< 8454 1726882403.13382: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/process.py <<< 8454 1726882403.13462: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc' import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3dd8b64b00><<< 8454 1726882403.13473: stdout chunk (state=3): >>> <<< 8454 1726882403.13495: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/reduction.py<<< 8454 1726882403.13540: stdout chunk (state=3): >>> # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc' <<< 8454 1726882403.13582: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc matches /usr/lib64/python3.12/pickle.py<<< 8454 1726882403.13648: stdout chunk (state=3): >>> # code object from '/usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc'<<< 8454 1726882403.13678: stdout chunk (state=3): >>> <<< 8454 1726882403.13730: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc matches /usr/lib64/python3.12/_compat_pickle.py<<< 8454 1726882403.13736: stdout chunk (state=3): >>> <<< 8454 1726882403.13757: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc' <<< 8454 1726882403.13778: stdout chunk (state=3): >>>import '_compat_pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3dd80b3c80> <<< 8454 1726882403.13865: stdout chunk (state=3): >>># extension module '_pickle' loaded from 
'/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' # extension module '_pickle' executed from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so'<<< 8454 1726882403.13868: stdout chunk (state=3): >>> import '_pickle' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3dd80b3fb0><<< 8454 1726882403.13963: stdout chunk (state=3): >>> import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3dd8adcd10> <<< 8454 1726882403.14007: stdout chunk (state=3): >>>import 'multiprocessing.reduction' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3dd8adc110> <<< 8454 1726882403.14069: stdout chunk (state=3): >>>import 'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3dd8b66780> <<< 8454 1726882403.14107: stdout chunk (state=3): >>>import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3dd8b66240> <<< 8454 1726882403.14147: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/pool.py<<< 8454 1726882403.14156: stdout chunk (state=3): >>> <<< 8454 1726882403.14257: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc matches /usr/lib64/python3.12/queue.py<<< 8454 1726882403.14276: stdout chunk (state=3): >>> # code object from '/usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc'<<< 8454 1726882403.14319: stdout chunk (state=3): >>> # /usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc matches /usr/lib64/python3.12/heapq.py<<< 8454 1726882403.14340: stdout chunk (state=3): >>> <<< 8454 1726882403.14352: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc'<<< 8454 1726882403.14406: stdout chunk (state=3): >>> # extension module '_heapq' loaded from 
'/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so'<<< 8454 1726882403.14426: stdout chunk (state=3): >>> # extension module '_heapq' executed from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' <<< 8454 1726882403.14449: stdout chunk (state=3): >>>import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3dd80cb0b0> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3dd80ca960><<< 8454 1726882403.14501: stdout chunk (state=3): >>> # extension module '_queue' loaded from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so'<<< 8454 1726882403.14506: stdout chunk (state=3): >>> <<< 8454 1726882403.14525: stdout chunk (state=3): >>># extension module '_queue' executed from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so'<<< 8454 1726882403.14557: stdout chunk (state=3): >>> import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3dd80cab40> import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3dd80c9d90> <<< 8454 1726882403.14599: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/util.py<<< 8454 1726882403.14758: stdout chunk (state=3): >>> # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc'<<< 8454 1726882403.14787: stdout chunk (state=3): >>> <<< 8454 1726882403.14790: stdout chunk (state=3): >>>import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3dd80cb110><<< 8454 1726882403.14832: stdout chunk (state=3): >>> # /usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/connection.py<<< 8454 1726882403.14841: stdout chunk (state=3): >>> <<< 8454 1726882403.14900: stdout chunk (state=3): >>># code object from 
'/usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc'<<< 8454 1726882403.14903: stdout chunk (state=3): >>> <<< 8454 1726882403.14950: stdout chunk (state=3): >>># extension module '_multiprocessing' loaded from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' <<< 8454 1726882403.14971: stdout chunk (state=3): >>># extension module '_multiprocessing' executed from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' <<< 8454 1726882403.14982: stdout chunk (state=3): >>>import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3dd8129c40><<< 8454 1726882403.14990: stdout chunk (state=3): >>> <<< 8454 1726882403.15043: stdout chunk (state=3): >>>import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3dd80cbc20><<< 8454 1726882403.15050: stdout chunk (state=3): >>> <<< 8454 1726882403.15097: stdout chunk (state=3): >>>import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3dd8b66360><<< 8454 1726882403.15113: stdout chunk (state=3): >>> import 'ansible.module_utils.facts.timeout' # <<< 8454 1726882403.15142: stdout chunk (state=3): >>> import 'ansible.module_utils.facts.collector' # <<< 8454 1726882403.15171: stdout chunk (state=3): >>> # zipimport: zlib available <<< 8454 1726882403.15208: stdout chunk (state=3): >>># zipimport: zlib available<<< 8454 1726882403.15225: stdout chunk (state=3): >>> import 'ansible.module_utils.facts.other' # <<< 8454 1726882403.15286: stdout chunk (state=3): >>> # zipimport: zlib available <<< 8454 1726882403.15577: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.other.facter' # <<< 8454 1726882403.15581: stdout chunk (state=3): >>># zipimport: zlib available <<< 8454 1726882403.15602: stdout chunk (state=3): >>># zipimport: zlib available <<< 8454 1726882403.15691: stdout chunk (state=3): 
>>>import 'ansible.module_utils.facts.other.ohai' # <<< 8454 1726882403.15725: stdout chunk (state=3): >>> # zipimport: zlib available<<< 8454 1726882403.15802: stdout chunk (state=3): >>> <<< 8454 1726882403.15814: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.system' # # zipimport: zlib available <<< 8454 1726882403.15911: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.system.apparmor' # <<< 8454 1726882403.15944: stdout chunk (state=3): >>># zipimport: zlib available <<< 8454 1726882403.16245: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.system.caps' # # zipimport: zlib available <<< 8454 1726882403.16249: stdout chunk (state=3): >>># zipimport: zlib available <<< 8454 1726882403.16290: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.chroot' # <<< 8454 1726882403.16351: stdout chunk (state=3): >>> # zipimport: zlib available <<< 8454 1726882403.16482: stdout chunk (state=3): >>># zipimport: zlib available<<< 8454 1726882403.16571: stdout chunk (state=3): >>> # zipimport: zlib available <<< 8454 1726882403.16659: stdout chunk (state=3): >>># zipimport: zlib available <<< 8454 1726882403.16782: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.utils' # <<< 8454 1726882403.16825: stdout chunk (state=3): >>> import 'ansible.module_utils.facts.system.cmdline' # <<< 8454 1726882403.16829: stdout chunk (state=3): >>># zipimport: zlib available <<< 8454 1726882403.17813: stdout chunk (state=3): >>># zipimport: zlib available <<< 8454 1726882403.18699: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.distribution' # <<< 8454 1726882403.18720: stdout chunk (state=3): >>> # zipimport: zlib available<<< 8454 1726882403.18769: stdout chunk (state=3): >>> <<< 8454 1726882403.18841: stdout chunk (state=3): >>># zipimport: zlib available <<< 8454 1726882403.18957: stdout chunk (state=3): >>># 
zipimport: zlib available <<< 8454 1726882403.19082: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.compat.datetime' # <<< 8454 1726882403.19086: stdout chunk (state=3): >>> <<< 8454 1726882403.19109: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.date_time' # <<< 8454 1726882403.19274: stdout chunk (state=3): >>> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.env' # # zipimport: zlib available <<< 8454 1726882403.19301: stdout chunk (state=3): >>># zipimport: zlib available<<< 8454 1726882403.19369: stdout chunk (state=3): >>> <<< 8454 1726882403.19415: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.dns' # <<< 8454 1726882403.19447: stdout chunk (state=3): >>> # zipimport: zlib available<<< 8454 1726882403.19472: stdout chunk (state=3): >>> <<< 8454 1726882403.19524: stdout chunk (state=3): >>># zipimport: zlib available<<< 8454 1726882403.19530: stdout chunk (state=3): >>> <<< 8454 1726882403.19568: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.fips' # <<< 8454 1726882403.19597: stdout chunk (state=3): >>> # zipimport: zlib available <<< 8454 1726882403.19690: stdout chunk (state=3): >>># zipimport: zlib available <<< 8454 1726882403.19695: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.loadavg' # <<< 8454 1726882403.19773: stdout chunk (state=3): >>># zipimport: zlib available <<< 8454 1726882403.19849: stdout chunk (state=3): >>># zipimport: zlib available <<< 8454 1726882403.20006: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/glob.py <<< 8454 1726882403.20018: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc' <<< 8454 1726882403.20047: stdout chunk (state=3): >>>import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3dd812af30> <<< 8454 
1726882403.20074: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc matches /usr/lib64/python3.12/configparser.py <<< 8454 1726882403.20126: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc' <<< 8454 1726882403.20514: stdout chunk (state=3): >>>import 'configparser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3dd812a480> import 'ansible.module_utils.facts.system.local' # # zipimport: zlib available # zipimport: zlib available <<< 8454 1726882403.20621: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.lsb' # <<< 8454 1726882403.20626: stdout chunk (state=3): >>># zipimport: zlib available <<< 8454 1726882403.20764: stdout chunk (state=3): >>># zipimport: zlib available <<< 8454 1726882403.20915: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.pkg_mgr' # <<< 8454 1726882403.20927: stdout chunk (state=3): >>># zipimport: zlib available <<< 8454 1726882403.21045: stdout chunk (state=3): >>># zipimport: zlib available <<< 8454 1726882403.21171: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.platform' # <<< 8454 1726882403.21189: stdout chunk (state=3): >>># zipimport: zlib available <<< 8454 1726882403.21244: stdout chunk (state=3): >>># zipimport: zlib available <<< 8454 1726882403.21332: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc matches /usr/lib64/python3.12/ssl.py <<< 8454 1726882403.21397: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc' <<< 8454 1726882403.21506: stdout chunk (state=3): >>># extension module '_ssl' loaded from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' <<< 8454 1726882403.21615: stdout chunk (state=3): >>># extension module '_ssl' executed from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' import '_ssl' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7f3dd8155f10> <<< 8454 1726882403.21989: stdout chunk (state=3): >>>import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3dd8146bd0> import 'ansible.module_utils.facts.system.python' # <<< 8454 1726882403.22065: stdout chunk (state=3): >>># zipimport: zlib available <<< 8454 1726882403.22105: stdout chunk (state=3): >>># zipimport: zlib available <<< 8454 1726882403.22187: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.selinux' # <<< 8454 1726882403.22218: stdout chunk (state=3): >>># zipimport: zlib available <<< 8454 1726882403.22348: stdout chunk (state=3): >>># zipimport: zlib available <<< 8454 1726882403.22495: stdout chunk (state=3): >>># zipimport: zlib available <<< 8454 1726882403.22702: stdout chunk (state=3): >>># zipimport: zlib available <<< 8454 1726882403.22966: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.version' # import 'ansible.module_utils.facts.system.service_mgr' # <<< 8454 1726882403.23173: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.ssh_pub_keys' # # zipimport: zlib available # zipimport: zlib available <<< 8454 1726882403.23249: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc matches /usr/lib64/python3.12/getpass.py # code object from '/usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc' <<< 8454 1726882403.23309: stdout chunk (state=3): >>># extension module 'termios' loaded from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' <<< 8454 1726882403.23364: stdout chunk (state=3): >>># extension module 'termios' executed from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3dd7f45c70> import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 
0x7f3dd7f45bb0> import 'ansible.module_utils.facts.system.user' # <<< 8454 1726882403.23371: stdout chunk (state=3): >>># zipimport: zlib available <<< 8454 1726882403.23416: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.hardware' # <<< 8454 1726882403.23420: stdout chunk (state=3): >>># zipimport: zlib available <<< 8454 1726882403.23483: stdout chunk (state=3): >>># zipimport: zlib available <<< 8454 1726882403.23569: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.base' # <<< 8454 1726882403.23575: stdout chunk (state=3): >>># zipimport: zlib available <<< 8454 1726882403.23853: stdout chunk (state=3): >>># zipimport: zlib available <<< 8454 1726882403.24147: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.aix' # <<< 8454 1726882403.24151: stdout chunk (state=3): >>># zipimport: zlib available <<< 8454 1726882403.24337: stdout chunk (state=3): >>># zipimport: zlib available <<< 8454 1726882403.24521: stdout chunk (state=3): >>># zipimport: zlib available <<< 8454 1726882403.24582: stdout chunk (state=3): >>># zipimport: zlib available <<< 8454 1726882403.24667: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.sysctl' # <<< 8454 1726882403.24687: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.darwin' # # zipimport: zlib available <<< 8454 1726882403.24868: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 8454 1726882403.25006: stdout chunk (state=3): >>># zipimport: zlib available <<< 8454 1726882403.25255: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.freebsd' # import 'ansible.module_utils.facts.hardware.dragonfly' # <<< 8454 1726882403.25274: stdout chunk (state=3): >>># zipimport: zlib available <<< 8454 1726882403.25497: stdout chunk (state=3): >>># zipimport: zlib available <<< 8454 1726882403.25726: stdout chunk (state=3): >>>import 
'ansible.module_utils.facts.hardware.hpux' # <<< 8454 1726882403.25746: stdout chunk (state=3): >>># zipimport: zlib available <<< 8454 1726882403.25791: stdout chunk (state=3): >>># zipimport: zlib available <<< 8454 1726882403.25848: stdout chunk (state=3): >>># zipimport: zlib available <<< 8454 1726882403.26898: stdout chunk (state=3): >>># zipimport: zlib available <<< 8454 1726882403.27874: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.linux' # import 'ansible.module_utils.facts.hardware.hurd' # <<< 8454 1726882403.27878: stdout chunk (state=3): >>># zipimport: zlib available <<< 8454 1726882403.28063: stdout chunk (state=3): >>># zipimport: zlib available <<< 8454 1726882403.28259: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.netbsd' # <<< 8454 1726882403.28262: stdout chunk (state=3): >>># zipimport: zlib available <<< 8454 1726882403.28438: stdout chunk (state=3): >>># zipimport: zlib available <<< 8454 1726882403.28613: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.openbsd' # <<< 8454 1726882403.28633: stdout chunk (state=3): >>># zipimport: zlib available <<< 8454 1726882403.28904: stdout chunk (state=3): >>># zipimport: zlib available <<< 8454 1726882403.29191: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.sunos' # <<< 8454 1726882403.29266: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network' # # zipimport: zlib available <<< 8454 1726882403.29312: stdout chunk (state=3): >>># zipimport: zlib available <<< 8454 1726882403.29367: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.base' # <<< 8454 1726882403.29387: stdout chunk (state=3): >>># zipimport: zlib available <<< 8454 1726882403.29553: stdout chunk (state=3): >>># zipimport: zlib available <<< 8454 1726882403.29728: stdout chunk (state=3): >>># zipimport: zlib available <<< 8454 1726882403.30113: stdout 
chunk (state=3): >>># zipimport: zlib available <<< 8454 1726882403.30506: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.generic_bsd' # import 'ansible.module_utils.facts.network.aix' # <<< 8454 1726882403.30673: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.darwin' # # zipimport: zlib available # zipimport: zlib available <<< 8454 1726882403.30714: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.dragonfly' # <<< 8454 1726882403.30735: stdout chunk (state=3): >>># zipimport: zlib available <<< 8454 1726882403.30862: stdout chunk (state=3): >>># zipimport: zlib available <<< 8454 1726882403.30990: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.fc_wwn' # # zipimport: zlib available <<< 8454 1726882403.31015: stdout chunk (state=3): >>># zipimport: zlib available <<< 8454 1726882403.31069: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.freebsd' # <<< 8454 1726882403.31073: stdout chunk (state=3): >>># zipimport: zlib available <<< 8454 1726882403.31173: stdout chunk (state=3): >>># zipimport: zlib available <<< 8454 1726882403.31271: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.hpux' # <<< 8454 1726882403.31286: stdout chunk (state=3): >>># zipimport: zlib available <<< 8454 1726882403.31373: stdout chunk (state=3): >>># zipimport: zlib available <<< 8454 1726882403.31485: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.hurd' # <<< 8454 1726882403.31492: stdout chunk (state=3): >>># zipimport: zlib available <<< 8454 1726882403.31984: stdout chunk (state=3): >>># zipimport: zlib available <<< 8454 1726882403.32477: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.linux' # <<< 8454 1726882403.32503: stdout chunk (state=3): >>># zipimport: zlib available <<< 8454 1726882403.32686: stdout chunk (state=3): >>># zipimport: zlib available 
<<< 8454 1726882403.32690: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.iscsi' # <<< 8454 1726882403.32707: stdout chunk (state=3): >>># zipimport: zlib available <<< 8454 1726882403.32751: stdout chunk (state=3): >>># zipimport: zlib available <<< 8454 1726882403.32808: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.nvme' # <<< 8454 1726882403.32825: stdout chunk (state=3): >>># zipimport: zlib available <<< 8454 1726882403.32875: stdout chunk (state=3): >>># zipimport: zlib available <<< 8454 1726882403.32925: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.netbsd' # <<< 8454 1726882403.32940: stdout chunk (state=3): >>># zipimport: zlib available <<< 8454 1726882403.32989: stdout chunk (state=3): >>># zipimport: zlib available <<< 8454 1726882403.33041: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.openbsd' # <<< 8454 1726882403.33065: stdout chunk (state=3): >>># zipimport: zlib available <<< 8454 1726882403.33199: stdout chunk (state=3): >>># zipimport: zlib available <<< 8454 1726882403.33464: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.sunos' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual' # # zipimport: zlib available # zipimport: zlib available <<< 8454 1726882403.33529: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.base' # <<< 8454 1726882403.33533: stdout chunk (state=3): >>># zipimport: zlib available <<< 8454 1726882403.33560: stdout chunk (state=3): >>># zipimport: zlib available <<< 8454 1726882403.33585: stdout chunk (state=3): >>># zipimport: zlib available <<< 8454 1726882403.33678: stdout chunk (state=3): >>># zipimport: zlib available <<< 8454 1726882403.33752: stdout chunk (state=3): >>># zipimport: zlib available <<< 8454 1726882403.33880: stdout chunk (state=3): >>># zipimport: zlib available <<< 8454 1726882403.34030: stdout chunk (state=3): 
>>>import 'ansible.module_utils.facts.virtual.sysctl' # <<< 8454 1726882403.34036: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.freebsd' # <<< 8454 1726882403.34048: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.dragonfly' # # zipimport: zlib available <<< 8454 1726882403.34124: stdout chunk (state=3): >>># zipimport: zlib available <<< 8454 1726882403.34220: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.hpux' # <<< 8454 1726882403.34224: stdout chunk (state=3): >>># zipimport: zlib available <<< 8454 1726882403.34593: stdout chunk (state=3): >>># zipimport: zlib available <<< 8454 1726882403.34968: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.linux' # <<< 8454 1726882403.34983: stdout chunk (state=3): >>># zipimport: zlib available <<< 8454 1726882403.35284: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.virtual.netbsd' # # zipimport: zlib available # zipimport: zlib available <<< 8454 1726882403.35298: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.openbsd' # # zipimport: zlib available <<< 8454 1726882403.35496: stdout chunk (state=3): >>># zipimport: zlib available <<< 8454 1726882403.35602: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.sunos' # import 'ansible.module_utils.facts.default_collectors' # <<< 8454 1726882403.35644: stdout chunk (state=3): >>># zipimport: zlib available <<< 8454 1726882403.35779: stdout chunk (state=3): >>># zipimport: zlib available <<< 8454 1726882403.35923: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.ansible_collector' # import 'ansible.module_utils.facts.compat' # import 'ansible.module_utils.facts' # <<< 8454 1726882403.36164: stdout chunk (state=3): >>># zipimport: zlib available <<< 8454 1726882403.36688: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc matches 
/usr/lib64/python3.12/encodings/idna.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc' <<< 8454 1726882403.36820: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc matches /usr/lib64/python3.12/stringprep.py # code object from '/usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc' # extension module 'unicodedata' loaded from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' # extension module 'unicodedata' executed from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3dd7f723c0> import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3dd7f70170> <<< 8454 1726882403.36874: stdout chunk (state=3): >>>import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3dd7f6fe00> <<< 8454 1726882403.52385: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/queues.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/queues.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/queues.cpython-312.pyc' import 'multiprocessing.queues' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3dd7f73ce0> # /usr/lib64/python3.12/multiprocessing/__pycache__/synchronize.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/synchronize.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/synchronize.cpython-312.pyc' import 'multiprocessing.synchronize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3dd7fb8860> <<< 8454 1726882403.52461: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/dummy/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/dummy/__init__.py <<< 8454 1726882403.52631: stdout chunk (state=3): >>># code object from 
'/usr/lib64/python3.12/multiprocessing/dummy/__pycache__/__init__.cpython-312.pyc' <<< 8454 1726882403.52636: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/dummy/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/dummy/connection.py # code object from '/usr/lib64/python3.12/multiprocessing/dummy/__pycache__/connection.cpython-312.pyc' import 'multiprocessing.dummy.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3dd7fb9f70> <<< 8454 1726882403.52639: stdout chunk (state=3): >>>import 'multiprocessing.dummy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3dd7fb98b0> <<< 8454 1726882403.53056: stdout chunk (state=3): >>>PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame <<< 8454 1726882403.76124: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_fibre_channel_wwn": [], "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_system": "Linux", "ansible_kernel": "6.10.9-100.fc39.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Mon Sep 9 02:28:01 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-41-238.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-41-238", "ansible_nodename": "ip-10-31-41-238.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", 
"ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec243b3f63949f99a85dd461938b27f6", "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_loadavg": {"1m": 0.47021484375, "5m": 0.30078125, "15m": 0.15869140625}, "ansible_ssh_host_key_dsa_public": "AAAAB3NzaC1kc3MAAACBANYijgij1fEhTOf5yay/qzv1+ckF/sTeAcrQU7mSl4JlHlSjJFRS9ZOEcTyZhIM24bmrUmUAXGByisr1fJhHdM1w6H4TV9d8eAGz5dvqRt3OMFXU98TIudAuK2zln4nrfCSz2a6X/3opBJuckX9rZaO0ickijdGATG1zU5j6yse5AAAAFQCB9h5S0fKeFdZzGlNOZp7suEtGLQAAAIBqL8YusJlS5M+t8hqB5XoiVX2JRwxeO45o1F+YDEL8s88gEWv3QxNNB5xqhdMrEbA13n8FJWfZZdvcU7PONunHJRbKJZFHcCdK5TI9eGObNVaZTYNSFhZ2BieAeUf4m7eiHxQI/o71WHee8ehKt8oSXovKXzKzzh6V8adityCM0QAAAIBN41A6QjMEnqm1991CkEko30YVBdWgdcunoDD7NJJoONNTR054WsZIbydxWyYVFa3fC4HcmSYJjXHxuSZuCmzFZYOzQSedVWWiET/kLEvIDxOZEQ44DCsa3zg29Ty97IbNNwLSIOFXoUbWllCQV9qge0q5dQ/J1wTQdymso3DyLQ==", "ansible_ssh_host_key_dsa_public_keytype": "ssh-dss", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQC9Kn//qIm4a6eEMMZR4qpdJnSJr3FLIr3UgGTmV<<< 8454 1726882403.76443: stdout chunk (state=3): >>>ZPamYNdQ29vbdZONWFxDxoVqR69dvgMi7B4CVdaUu6QyBPlMI1hnnTMD7yGuFvf0wDLvk2p5tQf1MwOc8WdJCPqkvcCYTOD4gBf/qOT2LG96u3e6y9NpDSs43WwFzV2YMOpEVnbg+17SPjuOyE07jTJi4gLXbcjXxt9rz8nQMlsQPFysakPATk6pjVZnnTWDcFUSfc1sUdO6IWl/O4jlB/QtP/FkO38YQbSYx1fiZNsk+JP6ZeZ4F0trwlxRemd9P6eEqtA9jVdvSCvJNHgZoMob64uw1c2P8BFaAByky5crE35ggw6pKcQTHTAHhrPBTx12gpwlL4rB+OoysKYhxI8VeW+TYiNWBxF+EUpmcj/QMfOOgNbIEeK+YfNZ606vwhkyjORVqaN3MswYozhtwmAoyxDKaTAYWXo4+d+GqZ7pURKpwdZrI8M7e8Nvd+dwpW3OtrfAqXvFwIrBrivFfWnDE0=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBBsLvf+TKoJqIfm5H0y7RP9w7PN99SnDATkd0bkTPwuIbQqBA6MAihYQaVCQtnKQCWC09GNZyMQSeayjLONajkY=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": 
"AAAAC3NzaC1lZDI1NTE5AAAAIME0e8Dw9KPaZbGCYYNAh3+j3dHxYuGhpELosAmEvhOR", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3717, "ansible_memfree_mb": 2904, "ansible_swaptotal_mb": 3716, "ansible_swapfree_mb": 3716, "ansible_memory_mb": {"real": {"total": 3717, "used": 813, "free": 2904}, "nocache": {"free": 3479, "used": 238}, "swap": {"total": 3716, "free": 3716, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec243b3f-6394-9f99-a85d-d461938b27f6", "ansible_product_uuid": "ec243b3f-6394-9f99-a85d-d461938b27f6", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["f92a5a40-e33d-4a6f-8746-997eff27cfbd"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "f92a5a40-e33d-4a6f-8746-997eff27cfbd", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", 
"sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}, "zram0": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "4096", "partitions": {}, "rotational": "0", "scheduler_mode": "", "sectors": "7612416", "sectorsize": "4096", "size": "3.63 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["f92a5a40-e33d-4a6f-8746-997eff27cfbd"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 547, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "ext4", "options": "rw,seclabel,relatime", "dump": 0, "passno": 0, "size_total": 264124022784, "size_available": 251223314432, "block_size": 4096, "block_total": 64483404, "block_available": 61333817, "block_used": 3149587, "inode_total": 16384000, "inode_available": 16303858, "inode_used": 80142, "uuid": "f92a5a40-e33d-4a6f-8746-997eff27cfbd"}], "ansible_hostnqn": "", "ansible_dns": {"nameservers": ["127.0.0.53"], "options": {"edns0": true, "trust-ad": true}, "search": ["us-east-1.aws.redhat.com"]}, "ansible_fips": false, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_apparmor": {"status": "disabled"}, "ansible_local": {}, "ansible_distribution": "Fedora", "ansible_distribution_release": "", "ansible_distribution_version": "39", "ansible_distribution_major_version": "39", "ansible_distribution_file_path": "/etc/redhat-release", "ansible_distribution_file_variety": "RedHat", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_date_time": {"year": 
"2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "21", "minute": "33", "second": "23", "epoch": "1726882403", "epoch_int": "1726882403", "date": "2024-09-20", "time": "21:33:23", "iso8601_micro": "2024-09-21T01:33:23.730177Z", "iso8601": "2024-09-21T01:33:23Z", "iso8601_basic": "20240920T213323730177", "iso8601_basic_short": "20240920T213323", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_interfaces": ["eth0", "lo"], "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}]}, "ansible_eth0": {"device": "eth0", "macaddress": "0e:39:03:af:ed:a3", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.41.238", "broadcast": "10.31.43.255", "netmask": "255.255.252.0", "network": "10.31.40.0", "prefix": "22"}, "ipv6": [{"address": "fe80::a0b7:fdc4:48e8:7158", "prefix": "64", "scope": "link"}]}, "ansible_default_ipv4": {"gateway": "10.31.40.1", "interface": "eth0", "address": "10.31.41.238", "broadcast": "10.31.43.255", "netmask": "255.255.252.0", "network": "10.31.40.0", "prefix": "22", "macaddress": "0e:39:03:af:ed:a3", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.41.238"], "ansible_all_ipv6_addresses": ["fe80::a0b7:fdc4:48e8:7158"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.41.238", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::a0b7:fdc4:48e8:7158"]}, "ansible_is_chroot": false, "ansible_lsb": {}, "ansible_service_mgr": "systemd", "ansible_iscsi_iqn": "", "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": 
"/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.9-100.fc39.x86_64", "root": "UUID=f92a5a40-e33d-4a6f-8746-997eff27cfbd", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.9-100.fc39.x86_64", "root": "UUID=f92a5a40-e33d-4a6f-8746-997eff27cfbd", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": ["tty0", "ttyS0,115200n8"]}, "ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "EDITOR": "/usr/bin/nano", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.14.145 37172 10.31.41.238 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.14.145 37172 22", "DEBUGINFOD_URLS": "https://debuginfod.fedoraproject.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_pkg_mgr": "dnf", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 8454 1726882403.76794: stdout chunk (state=3): >>># clear sys.path_importer_cache <<< 8454 1726882403.76806: stdout chunk (state=3): >>># clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc <<< 8454 1726882403.76876: stdout chunk (state=3): >>># clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear 
sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc <<< 8454 1726882403.76912: stdout chunk (state=3): >>># cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # 
cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _sha2 # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing <<< 8454 1726882403.77247: stdout chunk (state=3): >>># cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # 
cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal <<< 8454 1726882403.77252: stdout chunk (state=3): >>># cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # 
destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing ansible.module_utils.common.locale <<< 8454 1726882403.77257: stdout chunk (state=3): >>># destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # 
destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _heapq # cleanup[2] removing heapq # destroy heapq # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing 
ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] 
removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat <<< 8454 1726882403.77265: stdout chunk (state=3): >>># cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy 
ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy 
ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # cleanup[2] removing multiprocessing.queues <<< 8454 1726882403.77439: stdout chunk (state=3): >>># cleanup[2] removing multiprocessing.synchronize # cleanup[2] removing multiprocessing.dummy.connection # cleanup[2] removing multiprocessing.dummy <<< 8454 1726882403.77614: stdout chunk (state=3): >>># destroy _sitebuiltins <<< 8454 1726882403.77751: stdout chunk (state=3): >>># destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy binascii # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path <<< 8454 1726882403.77791: stdout chunk (state=3): >>># destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy ipaddress # destroy ntpath # destroy importlib # 
destroy zipimport # destroy __main__ # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings <<< 8454 1726882403.77818: stdout chunk (state=3): >>># destroy _locale # destroy locale # destroy select # destroy _signal # destroy _posixsubprocess <<< 8454 1726882403.77848: stdout chunk (state=3): >>># destroy syslog # destroy uuid <<< 8454 1726882403.77953: stdout chunk (state=3): >>># destroy _hashlib # destroy _blake2 # destroy selinux # destroy shutil <<< 8454 1726882403.77963: stdout chunk (state=3): >>># destroy distro # destroy distro.distro # destroy argparse # destroy logging <<< 8454 1726882403.78006: stdout chunk (state=3): >>># destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.queues # destroy multiprocessing.synchronize # destroy multiprocessing.dummy # destroy multiprocessing.pool # destroy signal # destroy pickle # destroy _compat_pickle # destroy _pickle <<< 8454 1726882403.78014: stdout chunk (state=3): >>># destroy queue # destroy _heapq <<< 8454 1726882403.78109: stdout chunk (state=3): >>># destroy _queue # destroy multiprocessing.reduction # destroy selectors # destroy shlex # destroy fcntl # destroy datetime # destroy subprocess # destroy base64 <<< 8454 1726882403.78121: stdout chunk (state=3): >>># destroy _ssl <<< 8454 1726882403.78190: stdout chunk (state=3): >>># destroy ansible.module_utils.compat.selinux # destroy getpass # destroy pwd # destroy termios # destroy json # destroy socket # destroy struct <<< 8454 1726882403.78201: stdout chunk (state=3): >>># destroy glob # destroy fnmatch # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # destroy unicodedata # destroy errno # destroy multiprocessing.connection # 
destroy tempfile # destroy multiprocessing.context # destroy multiprocessing.process # destroy multiprocessing.util # destroy _multiprocessing <<< 8454 1726882403.78296: stdout chunk (state=3): >>># destroy array # destroy multiprocessing.dummy.connection # cleanup[3] wiping encodings.idna # destroy stringprep <<< 8454 1726882403.78350: stdout chunk (state=3): >>># cleanup[3] wiping configparser # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform <<< 8454 1726882403.78362: stdout chunk (state=3): >>># cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _sha2 # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math <<< 8454 1726882403.78503: stdout chunk (state=3): >>># cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping 
_collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix <<< 8454 1726882403.78742: stdout chunk (state=3): >>># cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime <<< 8454 1726882403.78748: stdout chunk (state=3): >>># destroy sys.monitoring # destroy _socket <<< 8454 1726882403.78829: stdout chunk (state=3): >>># destroy _collections <<< 8454 1726882403.78865: stdout chunk (state=3): >>># destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves <<< 8454 1726882403.78874: stdout chunk (state=3): >>># destroy _frozen_importlib_external # destroy _imp # destroy 
_io # destroy marshal <<< 8454 1726882403.78899: stdout chunk (state=3): >>># clear sys.meta_path # clear sys.modules # destroy _frozen_importlib <<< 8454 1726882403.79131: stdout chunk (state=3): >>># destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy encodings.idna # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _operator # destroy _sha2 # destroy _sre # destroy _string # destroy re # destroy itertools # destroy _abc # destroy posix # destroy _functools # destroy builtins <<< 8454 1726882403.79158: stdout chunk (state=3): >>># destroy _thread # clear sys.audit hooks <<< 8454 1726882403.79614: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8454 1726882403.79621: stderr chunk (state=3): >>>Shared connection to 10.31.41.238 closed. 
<<< 8454 1726882403.79927: stderr chunk (state=3): >>><<< 8454 1726882403.79930: stdout chunk (state=3): >>><<< 8454 1726882403.80100: _low_level_execute_command() done: rc=0, stdout=import _frozen_importlib # frozen import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # import '_io' # import 'marshal' # import 'posix' # import '_frozen_importlib_external' # # installing zipimport hook import 'time' # import 'zipimport' # # installed zipimport hook # /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' import '_codecs' # import 'codecs' # # /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3dd920c530> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3dd91dbb30> # /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3dd920eab0> import '_signal' # import '_abc' # import 'abc' # import 'io' # import '_stat' # import 'stat' # import '_collections_abc' # import 'genericpath' # import 'posixpath' # import 'os' # import '_sitebuiltins' # Processing user site-packages Processing global site-packages Adding directory: '/usr/lib64/python3.12/site-packages' Adding directory: '/usr/lib/python3.12/site-packages' Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' # /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches 
/usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3dd9001160> # /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py # code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3dd9001fd0> import 'site' # Python 3.12.5 (main, Aug 7 2024, 00:00:00) [GCC 13.3.1 20240522 (Red Hat 13.3.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. # /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py # code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py # code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py # code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py # code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3dd903fdd0> # /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py # code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3dd903ffe0> # /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py # code object 
from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py # code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' import 'itertools' # # /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3dd9077800> # /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py # code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3dd9077e90> import '_collections' # import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3dd9057aa0> import '_functools' # import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3dd9055190> import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3dd903cf80> # /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py # code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' import '_sre' # # /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py # code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py # code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3dd909b710> import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3dd909a330> # 
/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3dd9056060> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3dd9098a40> # /usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py # code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3dd90cc6e0> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3dd903c200> # /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3dd90ccb90> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3dd90cca40> # extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3dd90cce00> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3dd903ad20> # /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' # 
/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py # code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3dd90cd4c0> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3dd90cd190> import 'importlib.machinery' # # /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3dd90ce3c0> import 'importlib.util' # import 'runpy' # # /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py # code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3dd90e85c0> import 'errno' # # extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3dd90e9d00> # /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py # code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3dd90eabd0> # extension module '_bz2' 
loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3dd90eb230> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3dd90ea120> # /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py # code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' # extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3dd90ebc80> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3dd90eb3b0> import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3dd90ce3f0> # /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py # code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py # code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' # extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3dd8ddbb30> # /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' # extension module '_bisect' loaded from 
'/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3dd8e04620> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3dd8e04380> # extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3dd8e04590> # extension module '_sha2' loaded from '/usr/lib64/python3.12/lib-dynload/_sha2.cpython-312-x86_64-linux-gnu.so' # extension module '_sha2' executed from '/usr/lib64/python3.12/lib-dynload/_sha2.cpython-312-x86_64-linux-gnu.so' import '_sha2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3dd8e047d0> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3dd8dd9cd0> # /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py # code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py # code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3dd8e05e50> import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3dd8e04b00> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3dd90ceae0> # /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py # code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' # 
/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py # code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3dd8e361e0> # /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py # code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3dd8e4e390> # /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py # code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' import 'ntpath' # # /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3dd8e8b110> # /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py # code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3dd8eb18b0> import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3dd8e8b230> import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3dd8e4f020> # 
/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3dd8cd81d0> import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3dd8e4d3d0> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3dd8e06d80> # code object from '/usr/lib64/python3.12/encodings/cp437.pyc' import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7f3dd8cd83b0> # zipimport: found 103 names in '/tmp/ansible_ansible.legacy.setup_payload_xoigeo1p/ansible_ansible.legacy.setup_payload.zip' # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py # code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py # code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3dd8d3deb0> import '_typing' # import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3dd8d14da0> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3dd8cdbec0> # zipimport: zlib available import 'ansible' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches 
/usr/lib64/python3.12/__future__.py # code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3dd8d17d10> # /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' # extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3dd8d6d7f0> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3dd8d6d580> import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3dd8d6cec0> # /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3dd8d6d970> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3dd8d3e8d0> import 'atexit' # # extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 
0x7f3dd8d6e570> # extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3dd8d6e7b0> # /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py # code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' import '_locale' # import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3dd8d6ecf0> import 'pwd' # # /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py # code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3dd8bd4a10> # extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3dd8bd6630> # /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py # code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3dd8bd6f60> # /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py # code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3dd8bd7ef0> # /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py # code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' # 
/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3dd8bdabd0> # extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3dd8bdaf00> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3dd8bd8e90> # /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py # code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py # code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3dd8bdeae0> import '_tokenize' # import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3dd8bdd5b0> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3dd8bdd310> # /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py # code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' import 'textwrap' # <_frozen_importlib_external.SourceFileLoader 
object at 0x7f3dd8bdfb90> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3dd8bd93a0> # extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3dd8c22cc0> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3dd8c22ea0> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' # extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3dd8c24950> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3dd8c24710> # /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py # code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' # extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' # extension module '_uuid' executed from 
'/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3dd8c26f00> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3dd8c25040> # /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py # code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py # code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' import '_string' # import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3dd8c32720> import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3dd8c270b0> # extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3dd8c33a10> # extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3dd8c338c0> # extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3dd8c33b60> import 'systemd.journal' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f3dd8c23020> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py # code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' # extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3dd8c36b10> # extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3dd8c37dd0> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3dd8c352e0> # extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3dd8c36180> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3dd8c34e90> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common' # # zipimport: zlib available # zipimport: zlib available 
import 'ansible.module_utils.common.text' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.six' # import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # # /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' # extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3dd8ac1310> # /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3dd8ac0da0> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3dd8c37f50> import 'ansible.module_utils.compat.selinux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils._text' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py # code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3dd8ac14f0> # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.collections' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.warnings' # # 
zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.errors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing.convert_bool' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py # code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' import '_ast' # import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3dd8ac3350> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text.formatters' # import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # import 'ansible.module_utils.common.arg_spec' # # /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py # code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' # extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' # extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3dd8ac99a0> # extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3dd8aca300> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3dd8ac2360> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.locale' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib 
available # zipimport: zlib available # /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py # code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' # extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3dd8ac9070> import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3dd8aca390> import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' # /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py # code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py # code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3dd8b5e630> import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3dd8ad42f0> import 'distro.distro' 
# <_frozen_importlib_external.SourceFileLoader object at 0x7f3dd8ace420> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3dd8ace270> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # import 'ansible.module_utils.basic' # # zipimport: zlib available # zipimport: zlib available import 'ansible.modules' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.namespace' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.typing' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/__init__.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/context.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/process.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc' import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3dd8b64b00> # /usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/reduction.py # code object 
from '/usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc matches /usr/lib64/python3.12/pickle.py # code object from '/usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc matches /usr/lib64/python3.12/_compat_pickle.py # code object from '/usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc' import '_compat_pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3dd80b3c80> # extension module '_pickle' loaded from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' # extension module '_pickle' executed from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' import '_pickle' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3dd80b3fb0> import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3dd8adcd10> import 'multiprocessing.reduction' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3dd8adc110> import 'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3dd8b66780> import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3dd8b66240> # /usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/pool.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc matches /usr/lib64/python3.12/queue.py # code object from '/usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc matches /usr/lib64/python3.12/heapq.py # code object from '/usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc' # extension module '_heapq' loaded from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' # extension module '_heapq' 
executed from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3dd80cb0b0> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3dd80ca960> # extension module '_queue' loaded from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' # extension module '_queue' executed from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3dd80cab40> import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3dd80c9d90> # /usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/util.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc' import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3dd80cb110> # /usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/connection.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc' # extension module '_multiprocessing' loaded from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' # extension module '_multiprocessing' executed from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3dd8129c40> import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3dd80cbc20> import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3dd8b66360> import 'ansible.module_utils.facts.timeout' # import 'ansible.module_utils.facts.collector' # # zipimport: zlib available # zipimport: zlib available import 
'ansible.module_utils.facts.other' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other.facter' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other.ohai' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.apparmor' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.caps' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.chroot' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.utils' # import 'ansible.module_utils.facts.system.cmdline' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.distribution' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.datetime' # import 'ansible.module_utils.facts.system.date_time' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.env' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.dns' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.fips' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.loadavg' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/glob.py # code object from '/usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc' import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3dd812af30> # 
/usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc matches /usr/lib64/python3.12/configparser.py # code object from '/usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc' import 'configparser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3dd812a480> import 'ansible.module_utils.facts.system.local' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.lsb' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.pkg_mgr' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.platform' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc matches /usr/lib64/python3.12/ssl.py # code object from '/usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc' # extension module '_ssl' loaded from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' # extension module '_ssl' executed from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3dd8155f10> import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3dd8146bd0> import 'ansible.module_utils.facts.system.python' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.selinux' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.version' # import 'ansible.module_utils.facts.system.service_mgr' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.ssh_pub_keys' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc matches /usr/lib64/python3.12/getpass.py # code object from 
'/usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc' # extension module 'termios' loaded from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' # extension module 'termios' executed from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3dd7f45c70> import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3dd7f45bb0> import 'ansible.module_utils.facts.system.user' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.base' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.aix' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.sysctl' # import 'ansible.module_utils.facts.hardware.darwin' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.freebsd' # import 'ansible.module_utils.facts.hardware.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.hpux' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.linux' # import 'ansible.module_utils.facts.hardware.hurd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.sunos' # # zipimport: zlib available # zipimport: zlib available import 
'ansible.module_utils.facts.network' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.base' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.generic_bsd' # import 'ansible.module_utils.facts.network.aix' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.darwin' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.fc_wwn' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.freebsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.hpux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.hurd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.linux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.iscsi' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.nvme' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.sunos' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.base' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # 
zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.sysctl' # import 'ansible.module_utils.facts.virtual.freebsd' # import 'ansible.module_utils.facts.virtual.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.hpux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.linux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.sunos' # import 'ansible.module_utils.facts.default_collectors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.ansible_collector' # import 'ansible.module_utils.facts.compat' # import 'ansible.module_utils.facts' # # zipimport: zlib available # /usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc matches /usr/lib64/python3.12/encodings/idna.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc matches /usr/lib64/python3.12/stringprep.py # code object from '/usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc' # extension module 'unicodedata' loaded from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' # extension module 'unicodedata' executed from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3dd7f723c0> import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3dd7f70170> import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3dd7f6fe00> # 
/usr/lib64/python3.12/multiprocessing/__pycache__/queues.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/queues.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/queues.cpython-312.pyc' import 'multiprocessing.queues' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3dd7f73ce0> # /usr/lib64/python3.12/multiprocessing/__pycache__/synchronize.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/synchronize.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/synchronize.cpython-312.pyc' import 'multiprocessing.synchronize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3dd7fb8860> # /usr/lib64/python3.12/multiprocessing/dummy/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/dummy/__init__.py # code object from '/usr/lib64/python3.12/multiprocessing/dummy/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/dummy/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/dummy/connection.py # code object from '/usr/lib64/python3.12/multiprocessing/dummy/__pycache__/connection.cpython-312.pyc' import 'multiprocessing.dummy.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3dd7fb9f70> import 'multiprocessing.dummy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3dd7fb98b0> PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame {"ansible_facts": {"ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, 
"ansible_fibre_channel_wwn": [], "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_system": "Linux", "ansible_kernel": "6.10.9-100.fc39.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Mon Sep 9 02:28:01 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-41-238.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-41-238", "ansible_nodename": "ip-10-31-41-238.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec243b3f63949f99a85dd461938b27f6", "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_loadavg": {"1m": 0.47021484375, "5m": 0.30078125, "15m": 0.15869140625}, "ansible_ssh_host_key_dsa_public": "AAAAB3NzaC1kc3MAAACBANYijgij1fEhTOf5yay/qzv1+ckF/sTeAcrQU7mSl4JlHlSjJFRS9ZOEcTyZhIM24bmrUmUAXGByisr1fJhHdM1w6H4TV9d8eAGz5dvqRt3OMFXU98TIudAuK2zln4nrfCSz2a6X/3opBJuckX9rZaO0ickijdGATG1zU5j6yse5AAAAFQCB9h5S0fKeFdZzGlNOZp7suEtGLQAAAIBqL8YusJlS5M+t8hqB5XoiVX2JRwxeO45o1F+YDEL8s88gEWv3QxNNB5xqhdMrEbA13n8FJWfZZdvcU7PONunHJRbKJZFHcCdK5TI9eGObNVaZTYNSFhZ2BieAeUf4m7eiHxQI/o71WHee8ehKt8oSXovKXzKzzh6V8adityCM0QAAAIBN41A6QjMEnqm1991CkEko30YVBdWgdcunoDD7NJJoONNTR054WsZIbydxWyYVFa3fC4HcmSYJjXHxuSZuCmzFZYOzQSedVWWiET/kLEvIDxOZEQ44DCsa3zg29Ty97IbNNwLSIOFXoUbWllCQV9qge0q5dQ/J1wTQdymso3DyLQ==", "ansible_ssh_host_key_dsa_public_keytype": "ssh-dss", "ansible_ssh_host_key_rsa_public": 
"AAAAB3NzaC1yc2EAAAADAQABAAABgQC9Kn//qIm4a6eEMMZR4qpdJnSJr3FLIr3UgGTmVZPamYNdQ29vbdZONWFxDxoVqR69dvgMi7B4CVdaUu6QyBPlMI1hnnTMD7yGuFvf0wDLvk2p5tQf1MwOc8WdJCPqkvcCYTOD4gBf/qOT2LG96u3e6y9NpDSs43WwFzV2YMOpEVnbg+17SPjuOyE07jTJi4gLXbcjXxt9rz8nQMlsQPFysakPATk6pjVZnnTWDcFUSfc1sUdO6IWl/O4jlB/QtP/FkO38YQbSYx1fiZNsk+JP6ZeZ4F0trwlxRemd9P6eEqtA9jVdvSCvJNHgZoMob64uw1c2P8BFaAByky5crE35ggw6pKcQTHTAHhrPBTx12gpwlL4rB+OoysKYhxI8VeW+TYiNWBxF+EUpmcj/QMfOOgNbIEeK+YfNZ606vwhkyjORVqaN3MswYozhtwmAoyxDKaTAYWXo4+d+GqZ7pURKpwdZrI8M7e8Nvd+dwpW3OtrfAqXvFwIrBrivFfWnDE0=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBBsLvf+TKoJqIfm5H0y7RP9w7PN99SnDATkd0bkTPwuIbQqBA6MAihYQaVCQtnKQCWC09GNZyMQSeayjLONajkY=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIME0e8Dw9KPaZbGCYYNAh3+j3dHxYuGhpELosAmEvhOR", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3717, "ansible_memfree_mb": 2904, "ansible_swaptotal_mb": 3716, "ansible_swapfree_mb": 3716, "ansible_memory_mb": {"real": {"total": 3717, "used": 813, "free": 2904}, "nocache": {"free": 3479, "used": 238}, "swap": {"total": 3716, "free": 3716, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", 
"ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec243b3f-6394-9f99-a85d-d461938b27f6", "ansible_product_uuid": "ec243b3f-6394-9f99-a85d-d461938b27f6", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["f92a5a40-e33d-4a6f-8746-997eff27cfbd"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "f92a5a40-e33d-4a6f-8746-997eff27cfbd", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}, "zram0": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "4096", "partitions": {}, "rotational": "0", "scheduler_mode": "", "sectors": "7612416", "sectorsize": "4096", "size": "3.63 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["f92a5a40-e33d-4a6f-8746-997eff27cfbd"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 547, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "ext4", "options": "rw,seclabel,relatime", "dump": 0, "passno": 0, "size_total": 264124022784, "size_available": 251223314432, "block_size": 4096, "block_total": 64483404, "block_available": 61333817, "block_used": 3149587, 
"inode_total": 16384000, "inode_available": 16303858, "inode_used": 80142, "uuid": "f92a5a40-e33d-4a6f-8746-997eff27cfbd"}], "ansible_hostnqn": "", "ansible_dns": {"nameservers": ["127.0.0.53"], "options": {"edns0": true, "trust-ad": true}, "search": ["us-east-1.aws.redhat.com"]}, "ansible_fips": false, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_apparmor": {"status": "disabled"}, "ansible_local": {}, "ansible_distribution": "Fedora", "ansible_distribution_release": "", "ansible_distribution_version": "39", "ansible_distribution_major_version": "39", "ansible_distribution_file_path": "/etc/redhat-release", "ansible_distribution_file_variety": "RedHat", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "21", "minute": "33", "second": "23", "epoch": "1726882403", "epoch_int": "1726882403", "date": "2024-09-20", "time": "21:33:23", "iso8601_micro": "2024-09-21T01:33:23.730177Z", "iso8601": "2024-09-21T01:33:23Z", "iso8601_basic": "20240920T213323730177", "iso8601_basic_short": "20240920T213323", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_interfaces": ["eth0", "lo"], "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}]}, "ansible_eth0": {"device": "eth0", "macaddress": "0e:39:03:af:ed:a3", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.41.238", "broadcast": "10.31.43.255", "netmask": "255.255.252.0", "network": "10.31.40.0", "prefix": "22"}, "ipv6": 
[{"address": "fe80::a0b7:fdc4:48e8:7158", "prefix": "64", "scope": "link"}]}, "ansible_default_ipv4": {"gateway": "10.31.40.1", "interface": "eth0", "address": "10.31.41.238", "broadcast": "10.31.43.255", "netmask": "255.255.252.0", "network": "10.31.40.0", "prefix": "22", "macaddress": "0e:39:03:af:ed:a3", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.41.238"], "ansible_all_ipv6_addresses": ["fe80::a0b7:fdc4:48e8:7158"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.41.238", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::a0b7:fdc4:48e8:7158"]}, "ansible_is_chroot": false, "ansible_lsb": {}, "ansible_service_mgr": "systemd", "ansible_iscsi_iqn": "", "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.9-100.fc39.x86_64", "root": "UUID=f92a5a40-e33d-4a6f-8746-997eff27cfbd", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.9-100.fc39.x86_64", "root": "UUID=f92a5a40-e33d-4a6f-8746-997eff27cfbd", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": ["tty0", "ttyS0,115200n8"]}, "ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "EDITOR": "/usr/bin/nano", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.14.145 37172 10.31.41.238 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": 
"10.31.14.145 37172 22", "DEBUGINFOD_URLS": "https://debuginfod.fedoraproject.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_pkg_mgr": "dnf", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} # clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # 
cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _sha2 # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # 
cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing 
ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing 
ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _heapq # cleanup[2] removing heapq # destroy heapq # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing 
ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing 
ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing 
ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy 
ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # cleanup[2] removing multiprocessing.queues # cleanup[2] removing multiprocessing.synchronize # cleanup[2] removing multiprocessing.dummy.connection # cleanup[2] removing multiprocessing.dummy # destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # 
destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy binascii # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy locale # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy _hashlib # destroy _blake2 # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy logging # destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.queues # destroy multiprocessing.synchronize # destroy multiprocessing.dummy # destroy multiprocessing.pool # destroy signal # destroy pickle # destroy _compat_pickle # destroy _pickle # destroy queue # destroy _heapq # destroy _queue # destroy multiprocessing.reduction # destroy selectors # destroy shlex # destroy fcntl # destroy datetime # destroy subprocess # destroy base64 # destroy _ssl # destroy ansible.module_utils.compat.selinux # destroy getpass # destroy pwd # destroy termios # destroy json # destroy socket # destroy struct # destroy glob # destroy fnmatch # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # destroy unicodedata # destroy errno # destroy multiprocessing.connection # destroy tempfile # destroy multiprocessing.context # destroy multiprocessing.process # destroy multiprocessing.util # destroy _multiprocessing # destroy array # destroy multiprocessing.dummy.connection # cleanup[3] wiping encodings.idna # destroy stringprep 
# cleanup[3] wiping configparser # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _sha2 # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] 
wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy encodings.idna # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _operator # destroy _sha2 # destroy _sre # destroy _string # destroy re # destroy itertools # destroy _abc # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 
2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.41.238 closed. 
[WARNING]: Module invocation had junk after the JSON data: # clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 [identical Python interpreter shutdown trace as captured above]
ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy 
ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # cleanup[2] removing multiprocessing.queues # cleanup[2] removing multiprocessing.synchronize # cleanup[2] removing multiprocessing.dummy.connection # cleanup[2] removing multiprocessing.dummy # destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy binascii # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy 
encodings # destroy _locale # destroy locale # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy _hashlib # destroy _blake2 # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy logging # destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.queues # destroy multiprocessing.synchronize # destroy multiprocessing.dummy # destroy multiprocessing.pool # destroy signal # destroy pickle # destroy _compat_pickle # destroy _pickle # destroy queue # destroy _heapq # destroy _queue # destroy multiprocessing.reduction # destroy selectors # destroy shlex # destroy fcntl # destroy datetime # destroy subprocess # destroy base64 # destroy _ssl # destroy ansible.module_utils.compat.selinux # destroy getpass # destroy pwd # destroy termios # destroy json # destroy socket # destroy struct # destroy glob # destroy fnmatch # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # destroy unicodedata # destroy errno # destroy multiprocessing.connection # destroy tempfile # destroy multiprocessing.context # destroy multiprocessing.process # destroy multiprocessing.util # destroy _multiprocessing # destroy array # destroy multiprocessing.dummy.connection # cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping configparser # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # 
cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _sha2 # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # 
destroy _datetime # destroy sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy encodings.idna # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _operator # destroy _sha2 # destroy _sre # destroy _string # destroy re # destroy itertools # destroy _abc # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks [WARNING]: Platform linux on host managed_node3 is using the discovered Python interpreter at /usr/bin/python3.12, but future installation of another Python interpreter could change the meaning of that path. See https://docs.ansible.com/ansible- core/2.17/reference_appendices/interpreter_discovery.html for more information. 
8454 1726882403.84404: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882402.0511794-8467-280613591612713/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 8454 1726882403.84409: _low_level_execute_command(): starting 8454 1726882403.84411: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882402.0511794-8467-280613591612713/ > /dev/null 2>&1 && sleep 0' 8454 1726882403.85789: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 8454 1726882403.85799: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 8454 1726882403.85810: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 8454 1726882403.85993: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 8454 1726882403.85997: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8454 1726882403.86049: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 8454 1726882403.86341: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8454 1726882403.86345: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8454 1726882403.88642: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8454 1726882403.88647: stdout chunk (state=3): >>><<< 8454 1726882403.88655: stderr chunk (state=3): >>><<< 8454 1726882403.88680: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 8454 
1726882403.88694: handler run complete 8454 1726882403.89047: variable 'ansible_facts' from source: unknown 8454 1726882403.89335: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8454 1726882403.90346: variable 'ansible_facts' from source: unknown 8454 1726882403.90644: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8454 1726882403.90889: attempt loop complete, returning result 8454 1726882403.91008: _execute() done 8454 1726882403.91012: dumping result to json 8454 1726882403.91044: done dumping result, returning 8454 1726882403.91055: done running TaskExecutor() for managed_node3/TASK: Gathering Facts [0affe814-3a2d-f59f-16b9-0000000000cc] 8454 1726882403.91060: sending task result for task 0affe814-3a2d-f59f-16b9-0000000000cc 8454 1726882403.91705: done sending task result for task 0affe814-3a2d-f59f-16b9-0000000000cc 8454 1726882403.91709: WORKER PROCESS EXITING ok: [managed_node3] 8454 1726882403.93182: no more pending results, returning what we have 8454 1726882403.93185: results queue empty 8454 1726882403.93186: checking for any_errors_fatal 8454 1726882403.93188: done checking for any_errors_fatal 8454 1726882403.93189: checking for max_fail_percentage 8454 1726882403.93191: done checking for max_fail_percentage 8454 1726882403.93192: checking to see if all hosts have failed and the running result is not ok 8454 1726882403.93193: done checking to see if all hosts have failed 8454 1726882403.93194: getting the remaining hosts for this loop 8454 1726882403.93195: done getting the remaining hosts for this loop 8454 1726882403.93199: getting the next task for host managed_node3 8454 1726882403.93206: done getting next task for host managed_node3 8454 1726882403.93208: ^ task is: TASK: meta (flush_handlers) 8454 1726882403.93211: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, 
pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8454 1726882403.93216: getting variables 8454 1726882403.93217: in VariableManager get_vars() 8454 1726882403.93373: Calling all_inventory to load vars for managed_node3 8454 1726882403.93376: Calling groups_inventory to load vars for managed_node3 8454 1726882403.93383: Calling all_plugins_inventory to load vars for managed_node3 8454 1726882403.93394: Calling all_plugins_play to load vars for managed_node3 8454 1726882403.93397: Calling groups_plugins_inventory to load vars for managed_node3 8454 1726882403.93401: Calling groups_plugins_play to load vars for managed_node3 8454 1726882403.93865: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8454 1726882403.94587: done with get_vars() 8454 1726882403.94600: done getting variables 8454 1726882403.94803: in VariableManager get_vars() 8454 1726882403.94814: Calling all_inventory to load vars for managed_node3 8454 1726882403.94817: Calling groups_inventory to load vars for managed_node3 8454 1726882403.94820: Calling all_plugins_inventory to load vars for managed_node3 8454 1726882403.94826: Calling all_plugins_play to load vars for managed_node3 8454 1726882403.94829: Calling groups_plugins_inventory to load vars for managed_node3 8454 1726882403.94832: Calling groups_plugins_play to load vars for managed_node3 8454 1726882403.95275: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8454 1726882403.95576: done with get_vars() 8454 1726882403.95595: done queuing things up, now waiting for results queue to drain 8454 1726882403.95598: results queue empty 8454 1726882403.95599: checking for any_errors_fatal 8454 1726882403.95602: done checking for any_errors_fatal 8454 1726882403.95603: checking for 
max_fail_percentage 8454 1726882403.95604: done checking for max_fail_percentage 8454 1726882403.95609: checking to see if all hosts have failed and the running result is not ok 8454 1726882403.95610: done checking to see if all hosts have failed 8454 1726882403.95612: getting the remaining hosts for this loop 8454 1726882403.95613: done getting the remaining hosts for this loop 8454 1726882403.95616: getting the next task for host managed_node3 8454 1726882403.95622: done getting next task for host managed_node3 8454 1726882403.95625: ^ task is: TASK: Include the task 'el_repo_setup.yml' 8454 1726882403.95627: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8454 1726882403.95629: getting variables 8454 1726882403.95630: in VariableManager get_vars() 8454 1726882403.95642: Calling all_inventory to load vars for managed_node3 8454 1726882403.95645: Calling groups_inventory to load vars for managed_node3 8454 1726882403.95653: Calling all_plugins_inventory to load vars for managed_node3 8454 1726882403.95658: Calling all_plugins_play to load vars for managed_node3 8454 1726882403.95661: Calling groups_plugins_inventory to load vars for managed_node3 8454 1726882403.95665: Calling groups_plugins_play to load vars for managed_node3 8454 1726882403.95858: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8454 1726882403.96130: done with get_vars() 8454 1726882403.96141: done getting variables TASK [Include the task 'el_repo_setup.yml'] ************************************ task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/tests_bond_nm.yml:11 Friday 20 September 2024 21:33:23 -0400 (0:00:01.966) 0:00:01.980 ****** 8454 
1726882403.96241: entering _queue_task() for managed_node3/include_tasks 8454 1726882403.96243: Creating lock for include_tasks 8454 1726882403.96674: worker is 1 (out of 1 available) 8454 1726882403.96689: exiting _queue_task() for managed_node3/include_tasks 8454 1726882403.96702: done queuing things up, now waiting for results queue to drain 8454 1726882403.96704: waiting for pending results... 8454 1726882403.97201: running TaskExecutor() for managed_node3/TASK: Include the task 'el_repo_setup.yml' 8454 1726882403.97207: in run() - task 0affe814-3a2d-f59f-16b9-000000000006 8454 1726882403.97211: variable 'ansible_search_path' from source: unknown 8454 1726882403.97214: calling self._execute() 8454 1726882403.97246: variable 'ansible_host' from source: host vars for 'managed_node3' 8454 1726882403.97259: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8454 1726882403.97277: variable 'omit' from source: magic vars 8454 1726882403.97486: _execute() done 8454 1726882403.97497: dumping result to json 8454 1726882403.97562: done dumping result, returning 8454 1726882403.97566: done running TaskExecutor() for managed_node3/TASK: Include the task 'el_repo_setup.yml' [0affe814-3a2d-f59f-16b9-000000000006] 8454 1726882403.97834: sending task result for task 0affe814-3a2d-f59f-16b9-000000000006 8454 1726882403.97916: done sending task result for task 0affe814-3a2d-f59f-16b9-000000000006 8454 1726882403.97920: WORKER PROCESS EXITING 8454 1726882403.97986: no more pending results, returning what we have 8454 1726882403.97992: in VariableManager get_vars() 8454 1726882403.98028: Calling all_inventory to load vars for managed_node3 8454 1726882403.98241: Calling groups_inventory to load vars for managed_node3 8454 1726882403.98245: Calling all_plugins_inventory to load vars for managed_node3 8454 1726882403.98256: Calling all_plugins_play to load vars for managed_node3 8454 1726882403.98260: Calling groups_plugins_inventory to load vars for 
managed_node3 8454 1726882403.98264: Calling groups_plugins_play to load vars for managed_node3 8454 1726882403.98790: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8454 1726882403.99412: done with get_vars() 8454 1726882403.99420: variable 'ansible_search_path' from source: unknown 8454 1726882403.99436: we have included files to process 8454 1726882403.99437: generating all_blocks data 8454 1726882403.99439: done generating all_blocks data 8454 1726882403.99440: processing included file: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml 8454 1726882403.99441: loading included file: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml 8454 1726882403.99444: Loading data from /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml 8454 1726882404.00945: in VariableManager get_vars() 8454 1726882404.00964: done with get_vars() 8454 1726882404.00981: done processing included file 8454 1726882404.00984: iterating over new_blocks loaded from include file 8454 1726882404.00986: in VariableManager get_vars() 8454 1726882404.00997: done with get_vars() 8454 1726882404.00998: filtering new block on tags 8454 1726882404.01136: done filtering new block on tags 8454 1726882404.01141: in VariableManager get_vars() 8454 1726882404.01175: done with get_vars() 8454 1726882404.01177: filtering new block on tags 8454 1726882404.01202: done filtering new block on tags 8454 1726882404.01205: in VariableManager get_vars() 8454 1726882404.01217: done with get_vars() 8454 1726882404.01333: filtering new block on tags 8454 1726882404.01357: done filtering new block on tags 8454 1726882404.01360: done iterating over new_blocks loaded from include file included: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml for 
managed_node3 8454 1726882404.01366: extending task lists for all hosts with included blocks 8454 1726882404.01550: done extending task lists 8454 1726882404.01552: done processing included files 8454 1726882404.01553: results queue empty 8454 1726882404.01554: checking for any_errors_fatal 8454 1726882404.01556: done checking for any_errors_fatal 8454 1726882404.01557: checking for max_fail_percentage 8454 1726882404.01558: done checking for max_fail_percentage 8454 1726882404.01559: checking to see if all hosts have failed and the running result is not ok 8454 1726882404.01560: done checking to see if all hosts have failed 8454 1726882404.01561: getting the remaining hosts for this loop 8454 1726882404.01563: done getting the remaining hosts for this loop 8454 1726882404.01566: getting the next task for host managed_node3 8454 1726882404.01571: done getting next task for host managed_node3 8454 1726882404.01574: ^ task is: TASK: Gather the minimum subset of ansible_facts required by the network role test 8454 1726882404.01577: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False
8454 1726882404.01582: getting variables
8454 1726882404.01583: in VariableManager get_vars()
8454 1726882404.01592: Calling all_inventory to load vars for managed_node3
8454 1726882404.01595: Calling groups_inventory to load vars for managed_node3
8454 1726882404.01598: Calling all_plugins_inventory to load vars for managed_node3
8454 1726882404.01604: Calling all_plugins_play to load vars for managed_node3
8454 1726882404.01607: Calling groups_plugins_inventory to load vars for managed_node3
8454 1726882404.01611: Calling groups_plugins_play to load vars for managed_node3
8454 1726882404.02024: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
8454 1726882404.02354: done with get_vars()
8454 1726882404.02364: done getting variables

TASK [Gather the minimum subset of ansible_facts required by the network role test] ***
task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:3
Friday 20 September 2024  21:33:24 -0400 (0:00:00.062)       0:00:02.042 ******
8454 1726882404.02458: entering _queue_task() for managed_node3/setup
8454 1726882404.02792: worker is 1 (out of 1 available)
8454 1726882404.02803: exiting _queue_task() for managed_node3/setup
8454 1726882404.02814: done queuing things up, now waiting for results queue to drain
8454 1726882404.02816: waiting for pending results...
8454 1726882404.03095: running TaskExecutor() for managed_node3/TASK: Gather the minimum subset of ansible_facts required by the network role test
8454 1726882404.03163: in run() - task 0affe814-3a2d-f59f-16b9-0000000000dd
8454 1726882404.03193: variable 'ansible_search_path' from source: unknown
8454 1726882404.03201: variable 'ansible_search_path' from source: unknown
8454 1726882404.03300: calling self._execute()
8454 1726882404.03342: variable 'ansible_host' from source: host vars for 'managed_node3'
8454 1726882404.03355: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
8454 1726882404.03372: variable 'omit' from source: magic vars
8454 1726882404.04343: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name
8454 1726882404.07928: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py
8454 1726882404.08040: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py
8454 1726882404.08101: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py
8454 1726882404.08151: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py
8454 1726882404.08197: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py
8454 1726882404.08305: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
8454 1726882404.08352: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
8454 1726882404.08430: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
8454 1726882404.08460: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
8454 1726882404.08484: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
8454 1726882404.08725: variable 'ansible_facts' from source: unknown
8454 1726882404.08823: variable 'network_test_required_facts' from source: task vars
8454 1726882404.08885: Evaluated conditional (not ansible_facts.keys() | list | intersect(network_test_required_facts) == network_test_required_facts): True
8454 1726882404.08898: variable 'omit' from source: magic vars
8454 1726882404.08952: variable 'omit' from source: magic vars
8454 1726882404.09009: variable 'omit' from source: magic vars
8454 1726882404.09046: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection
8454 1726882404.09092: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
8454 1726882404.09119: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell
8454 1726882404.09198: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
8454 1726882404.09202: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
8454 1726882404.09211: variable 'inventory_hostname' from source: host vars for 'managed_node3'
8454 1726882404.09220: variable 'ansible_host' from source: host vars for 'managed_node3'
8454 1726882404.09228: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
8454 1726882404.09367: Set connection var ansible_connection to ssh
8454 1726882404.09387: Set connection var ansible_shell_executable to /bin/sh
8454 1726882404.09399: Set connection var ansible_timeout to 10
8454 1726882404.09412: Set connection var ansible_shell_type to sh
8454 1726882404.09432: Set connection var ansible_pipelining to False
8454 1726882404.09523: Set connection var ansible_module_compression to ZIP_DEFLATED
8454 1726882404.09528: variable 'ansible_shell_executable' from source: unknown
8454 1726882404.09531: variable 'ansible_connection' from source: unknown
8454 1726882404.09533: variable 'ansible_module_compression' from source: unknown
8454 1726882404.09535: variable 'ansible_shell_type' from source: unknown
8454 1726882404.09539: variable 'ansible_shell_executable' from source: unknown
8454 1726882404.09541: variable 'ansible_host' from source: host vars for 'managed_node3'
8454 1726882404.09543: variable 'ansible_pipelining' from source: unknown
8454 1726882404.09545: variable 'ansible_timeout' from source: unknown
8454 1726882404.09547: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
8454 1726882404.09771: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action)
8454 1726882404.09775: variable 'omit' from source: magic vars
8454 1726882404.09778: starting attempt loop
8454 1726882404.09783: running the handler
8454 1726882404.09788: _low_level_execute_command(): starting
8454 1726882404.09802: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0'
8454 1726882404.10657: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
8454 1726882404.10753: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<<
8454 1726882404.10775: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<<
8454 1726882404.10815: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
8454 1726882404.11054: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<<
8454 1726882404.13492: stdout chunk (state=3): >>>/root <<<
8454 1726882404.13786: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
8454 1726882404.14181: stdout chunk (state=3): >>><<<
8454 1726882404.14185: stderr chunk (state=3): >>><<<
8454 1726882404.14188: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0
8454 1726882404.14199: _low_level_execute_command(): starting
8454 1726882404.14202: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882404.1407347-8564-196702054007739 `" && echo ansible-tmp-1726882404.1407347-8564-196702054007739="` echo /root/.ansible/tmp/ansible-tmp-1726882404.1407347-8564-196702054007739 `" ) && sleep 0'
8454 1726882404.15297: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<<
8454 1726882404.15313: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<<
8454 1726882404.15326: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<<
8454 1726882404.15450: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<<
8454 1726882404.15531: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
8454 1726882404.15649: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<<
8454 1726882404.15660: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
8454 1726882404.15812: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<<
8454 1726882404.18739: stdout chunk (state=3): >>>ansible-tmp-1726882404.1407347-8564-196702054007739=/root/.ansible/tmp/ansible-tmp-1726882404.1407347-8564-196702054007739 <<<
8454 1726882404.19005: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
8454 1726882404.19008: stdout chunk (state=3): >>><<<
8454 1726882404.19011: stderr chunk (state=3): >>><<<
8454 1726882404.19032: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882404.1407347-8564-196702054007739=/root/.ansible/tmp/ansible-tmp-1726882404.1407347-8564-196702054007739 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0
8454 1726882404.19129: variable 'ansible_module_compression' from source: unknown
8454 1726882404.19222: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-8454ynh6eu7y/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED
8454 1726882404.19293: variable 'ansible_facts' from source: unknown
8454 1726882404.19516: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882404.1407347-8564-196702054007739/AnsiballZ_setup.py
8454 1726882404.19759: Sending initial data
8454 1726882404.19763: Sent initial data (152 bytes)
8454 1726882404.20347: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<<
8454 1726882404.20363: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<<
8454 1726882404.20383: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<<
8454 1726882404.20409: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
8454 1726882404.20516: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
8454 1726882404.20539: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<<
8454 1726882404.20556: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<<
8454 1726882404.20578: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
8454 1726882404.20725: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<<
8454 1726882404.23252: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<<
8454 1726882404.23425: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<<
8454 1726882404.23547: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-8454ynh6eu7y/tmpuvwosqbf /root/.ansible/tmp/ansible-tmp-1726882404.1407347-8564-196702054007739/AnsiballZ_setup.py <<<
8454 1726882404.23551: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882404.1407347-8564-196702054007739/AnsiballZ_setup.py" <<<
8454 1726882404.23740: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-8454ynh6eu7y/tmpuvwosqbf" to remote "/root/.ansible/tmp/ansible-tmp-1726882404.1407347-8564-196702054007739/AnsiballZ_setup.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882404.1407347-8564-196702054007739/AnsiballZ_setup.py" <<<
8454 1726882404.27630: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
8454 1726882404.27664: stderr chunk (state=3): >>><<<
8454 1726882404.27668: stdout chunk (state=3): >>><<<
8454 1726882404.27701: done transferring module to remote
8454 1726882404.27714: _low_level_execute_command(): starting
8454 1726882404.27719: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882404.1407347-8564-196702054007739/ /root/.ansible/tmp/ansible-tmp-1726882404.1407347-8564-196702054007739/AnsiballZ_setup.py && sleep 0'
8454 1726882404.28205: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<<
8454 1726882404.28208: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
8454 1726882404.28211: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<<
8454 1726882404.28213: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
8454 1726882404.28271: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<<
8454 1726882404.28274: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
8454 1726882404.28389: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<<
8454 1726882404.30676: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
8454 1726882404.30738: stderr chunk (state=3): >>><<<
8454 1726882404.30741: stdout chunk (state=3): >>><<<
8454 1726882404.30795: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0
8454 1726882404.30799: _low_level_execute_command(): starting
8454 1726882404.30802: _low_level_execute_command(): executing: /bin/sh -c 'PYTHONVERBOSE=1 /usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882404.1407347-8564-196702054007739/AnsiballZ_setup.py && sleep 0'
8454 1726882404.31209: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<<
8454 1726882404.31213: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found <<<
8454 1726882404.31215: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
8454 1726882404.31218: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<<
8454 1726882404.31220: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
8454 1726882404.31289: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<<
8454 1726882404.31291: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
8454 1726882404.31412: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<<
8454 1726882404.34321: stdout chunk (state=3): >>>import _frozen_importlib # frozen import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # import '_io' # import 'marshal' # <<<
8454 1726882404.34465: stdout chunk (state=3): >>>import 'posix' # <<<
8454 1726882404.34618: stdout chunk (state=3): >>>import '_frozen_importlib_external' # # installing zipimport hook import 'time' # import 'zipimport' # # installed zipimport hook # /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' import '_codecs' # import 'codecs' # # /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb1d3004530> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb1d2fd3b30> # /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' <<<
8454 1726882404.34652: stdout chunk (state=3): >>>import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb1d3006ab0> import '_signal' # <<<
8454 1726882404.34704: stdout chunk (state=3): >>>import '_abc' # import 'abc' # import 'io' # <<<
8454 1726882404.34727: stdout chunk (state=3): >>>import '_stat' # import 'stat' # <<<
8454 1726882404.35064: stdout chunk (state=3): >>>import '_collections_abc' # import 'genericpath' # import 'posixpath' # import 'os' # import '_sitebuiltins' # Processing user site-packages Processing global site-packages Adding directory: '/usr/lib64/python3.12/site-packages' Adding directory: '/usr/lib/python3.12/site-packages' Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' # /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb1d2db9160> <<<
8454 1726882404.35072: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py # code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' <<<
8454 1726882404.35098: stdout chunk (state=3): >>>import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb1d2db9fd0> <<<
8454 1726882404.35124: stdout chunk (state=3): >>>import 'site' # <<<
8454 1726882404.35310: stdout chunk (state=3): >>>Python 3.12.5 (main, Aug 7 2024, 00:00:00) [GCC 13.3.1 20240522 (Red Hat 13.3.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. <<<
8454 1726882404.35559: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py <<<
8454 1726882404.35591: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py # code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py <<<
8454 1726882404.35650: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py <<<
8454 1726882404.35710: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb1d2df7e90> # /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py <<<
8454 1726882404.35799: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' <<<
8454 1726882404.35823: stdout chunk (state=3): >>>import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb1d2df7f50> <<<
8454 1726882404.35827: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py # code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' <<<
8454 1726882404.36018: stdout chunk (state=3): >>># /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py # code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' <<<
8454 1726882404.36022: stdout chunk (state=3): >>>import 'itertools' # <<<
8454 1726882404.36026: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb1d2e2f860> # /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py # code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb1d2e2fef0> import '_collections' # <<<
8454 1726882404.36052: stdout chunk (state=3): >>>import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb1d2e0fb30> import '_functools' # <<<
8454 1726882404.36076: stdout chunk (state=3): >>>import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb1d2e0d1f0> <<<
8454 1726882404.36193: stdout chunk (state=3): >>>import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb1d2df5040> <<<
8454 1726882404.36352: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py <<<
8454 1726882404.36384: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' import '_sre' # # /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py # code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py # code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb1d2e53800> import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb1d2e52420> <<<
8454 1726882404.36406: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb1d2e0e2a0> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb1d2e50c80> <<<
8454 1726882404.36476: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py <<<
8454 1726882404.36497: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb1d2e84800> <<<
8454 1726882404.36587: stdout chunk (state=3): >>>import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb1d2df42c0> # /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb1d2e84cb0> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb1d2e84b60> <<<
8454 1726882404.36705: stdout chunk (state=3): >>># extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb1d2e84f50> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb1d2df2de0> # /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' <<<
8454 1726882404.36739: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py # code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb1d2e85640> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb1d2e85310> import 'importlib.machinery' # <<<
8454 1726882404.36790: stdout chunk (state=3): >>># /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb1d2e86540> <<<
8454 1726882404.36866: stdout chunk (state=3): >>>import 'importlib.util' # import 'runpy' # # /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py <<<
8454 1726882404.36991: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb1d2e9c770> import 'errno' # <<<
8454 1726882404.36995: stdout chunk (state=3): >>># extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb1d2e9deb0> <<<
8454 1726882404.37038: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py # code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb1d2e9ed80> <<<
8454 1726882404.37288: stdout chunk (state=3): >>># extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb1d2e9f3b0> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb1d2e9e2d0> # /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py <<<
8454 1726882404.37292: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' <<<
8454 1726882404.37337: stdout chunk (state=3): >>># extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb1d2e9fe30> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb1d2e9f560> import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb1d2e865a0> # /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py # code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py # code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' # extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb1d2b93cb0> <<<
8454 1726882404.37367: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' # extension module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb1d2bbc7d0> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb1d2bbc530> <<<
8454 1726882404.37439: stdout chunk (state=3): >>># extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb1d2bbc6e0> # extension module '_sha2' loaded from '/usr/lib64/python3.12/lib-dynload/_sha2.cpython-312-x86_64-linux-gnu.so' <<<
8454 1726882404.37446: stdout chunk (state=3): >>># extension module '_sha2' executed from '/usr/lib64/python3.12/lib-dynload/_sha2.cpython-312-x86_64-linux-gnu.so' import '_sha2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb1d2bbc920> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb1d2b91e50> <<<
8454 1726882404.37467: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py <<<
8454 1726882404.37568: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' <<<
8454 1726882404.37606: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py # code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' <<<
8454 1726882404.37642: stdout chunk (state=3): >>>import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb1d2bbe030> import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb1d2bbccb0> <<<
8454 1726882404.37666: stdout chunk (state=3): >>>import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb1d2e86c90> <<<
8454 1726882404.37682: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py <<<
8454 1726882404.37742: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' <<<
8454 1726882404.37787: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py <<<
8454 1726882404.37823: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' <<<
8454 1726882404.37885: stdout chunk (state=3): >>>import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb1d2bee3c0> <<<
8454 1726882404.37924: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' <<<
8454 1726882404.37976: stdout chunk (state=3): >>># 
/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py # code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' <<< 8454 1726882404.37996: stdout chunk (state=3): >>>import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb1d2c06510> <<< 8454 1726882404.38012: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py <<< 8454 1726882404.38054: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' <<< 8454 1726882404.38145: stdout chunk (state=3): >>>import 'ntpath' # <<< 8454 1726882404.38149: stdout chunk (state=3): >>># /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb1d2c3f2c0> <<< 8454 1726882404.38178: stdout chunk (state=3): >>># /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py <<< 8454 1726882404.38197: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' <<< 8454 1726882404.38530: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py # code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb1d2c65a60> import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb1d2c3f3e0> <<< 8454 1726882404.38569: stdout chunk (state=3): >>>import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb1d2c071a0> <<< 8454 1726882404.38726: stdout chunk (state=3): >>># 
/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb1d2a80380> import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb1d2c05550> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb1d2bbef60> <<< 8454 1726882404.38905: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/cp437.pyc' <<< 8454 1726882404.38936: stdout chunk (state=3): >>>import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7fb1d2a80560> <<< 8454 1726882404.39376: stdout chunk (state=3): >>># zipimport: found 103 names in '/tmp/ansible_setup_payload_o9rx_zjr/ansible_setup_payload.zip' # zipimport: zlib available <<< 8454 1726882404.39545: stdout chunk (state=3): >>># zipimport: zlib available <<< 8454 1726882404.39606: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py <<< 8454 1726882404.39623: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' <<< 8454 1726882404.39692: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py <<< 8454 1726882404.39813: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' <<< 8454 1726882404.39858: stdout chunk (state=3): >>># /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py <<< 8454 1726882404.39880: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 
0x7fb1d2aea060> <<< 8454 1726882404.39897: stdout chunk (state=3): >>>import '_typing' # <<< 8454 1726882404.40238: stdout chunk (state=3): >>>import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb1d2ac0f50> <<< 8454 1726882404.40469: stdout chunk (state=3): >>>import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb1d2ac00b0> # zipimport: zlib available import 'ansible' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils' # # zipimport: zlib available <<< 8454 1726882404.42491: stdout chunk (state=3): >>># zipimport: zlib available <<< 8454 1726882404.44409: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py <<< 8454 1726882404.44455: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb1d2ac3ef0> # /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' <<< 8454 1726882404.44478: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' <<< 8454 1726882404.44511: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' <<< 8454 1726882404.44568: stdout chunk (state=3): >>># extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from 
'/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' <<< 8454 1726882404.44580: stdout chunk (state=3): >>>import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb1d2b199a0> <<< 8454 1726882404.44627: stdout chunk (state=3): >>>import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb1d2b19730> <<< 8454 1726882404.44677: stdout chunk (state=3): >>>import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb1d2b19040> <<< 8454 1726882404.44708: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' <<< 8454 1726882404.44792: stdout chunk (state=3): >>>import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb1d2b19a90> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb1d2aeaa80> <<< 8454 1726882404.44795: stdout chunk (state=3): >>>import 'atexit' # <<< 8454 1726882404.44825: stdout chunk (state=3): >>># extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb1d2b1a750> <<< 8454 1726882404.44893: stdout chunk (state=3): >>># extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb1d2b1a990> # /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py <<< 8454 1726882404.44966: stdout chunk (state=3): >>># code object from 
'/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' <<< 8454 1726882404.44987: stdout chunk (state=3): >>>import '_locale' # <<< 8454 1726882404.45062: stdout chunk (state=3): >>>import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb1d2b1aed0> import 'pwd' # <<< 8454 1726882404.45083: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py <<< 8454 1726882404.45167: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb1d2980c20> <<< 8454 1726882404.45217: stdout chunk (state=3): >>># extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' <<< 8454 1726882404.45254: stdout chunk (state=3): >>># extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb1d2982840> # /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py <<< 8454 1726882404.45287: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' <<< 8454 1726882404.45350: stdout chunk (state=3): >>>import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb1d2983200> # /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py <<< 8454 1726882404.45382: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' <<< 8454 1726882404.45413: stdout chunk (state=3): >>>import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb1d29843e0> <<< 8454 1726882404.45564: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches 
/usr/lib64/python3.12/subprocess.py <<< 8454 1726882404.45740: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb1d2986e10> <<< 8454 1726882404.45776: stdout chunk (state=3): >>># extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb1d2986f30> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb1d29850d0> # /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py # code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py # code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' <<< 8454 1726882404.45806: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' <<< 8454 1726882404.45827: stdout chunk (state=3): >>>import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb1d298acf0> import '_tokenize' # <<< 8454 1726882404.45908: stdout chunk (state=3): >>>import 'tokenize' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7fb1d29897c0> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb1d2989520> <<< 8454 1726882404.45939: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py # code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' <<< 8454 1726882404.46021: stdout chunk (state=3): >>>import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb1d298bc50> <<< 8454 1726882404.46117: stdout chunk (state=3): >>>import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb1d29855e0> # extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb1d29cee40> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb1d29cefc0> <<< 8454 1726882404.46161: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py <<< 8454 1726882404.46304: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' <<< 8454 1726882404.46308: stdout chunk (state=3): >>># extension module '_datetime' loaded from 
'/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb1d29d0b90> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb1d29d0950> <<< 8454 1726882404.46333: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py <<< 8454 1726882404.46382: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' <<< 8454 1726882404.46449: stdout chunk (state=3): >>># extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb1d29d3110> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb1d29d1280> <<< 8454 1726882404.46523: stdout chunk (state=3): >>># /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py <<< 8454 1726882404.46557: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py # code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' import '_string' # <<< 8454 1726882404.46659: stdout chunk (state=3): >>>import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb1d29de840> <<< 8454 1726882404.46775: stdout chunk (state=3): >>>import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb1d29d31d0> <<< 8454 1726882404.46848: stdout chunk (state=3): >>># extension module 
'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb1d29dfb30> <<< 8454 1726882404.46904: stdout chunk (state=3): >>># extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb1d29df9e0> <<< 8454 1726882404.47003: stdout chunk (state=3): >>># extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb1d29dfc80> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb1d29cf290> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' <<< 8454 1726882404.47038: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py # code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' <<< 8454 1726882404.47099: stdout chunk (state=3): >>># extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' <<< 8454 1726882404.47113: stdout chunk (state=3): 
>>># extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb1d29e3380> <<< 8454 1726882404.47433: stdout chunk (state=3): >>># extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb1d29e45c0> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb1d29e1af0> <<< 8454 1726882404.47454: stdout chunk (state=3): >>># extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb1d29e2e70> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb1d29e16d0> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat' # # zipimport: zlib available <<< 8454 1726882404.47522: stdout chunk (state=3): >>># zipimport: zlib available <<< 8454 1726882404.47630: stdout chunk (state=3): >>># zipimport: zlib available <<< 8454 1726882404.47684: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.common' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text' # <<< 8454 1726882404.47752: stdout chunk (state=3): >>># zipimport: zlib available <<< 8454 1726882404.47887: stdout chunk (state=3): >>># zipimport: zlib available <<< 8454 1726882404.48115: stdout chunk (state=3): >>># zipimport: zlib available <<< 8454 1726882404.48673: stdout chunk 
(state=3): >>># zipimport: zlib available <<< 8454 1726882404.49352: stdout chunk (state=3): >>>import 'ansible.module_utils.six' # <<< 8454 1726882404.49374: stdout chunk (state=3): >>>import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # <<< 8454 1726882404.49397: stdout chunk (state=3): >>># /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py <<< 8454 1726882404.49433: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' <<< 8454 1726882404.49668: stdout chunk (state=3): >>># extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb1d286c860> <<< 8454 1726882404.49685: stdout chunk (state=3): >>># /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb1d286d760> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb1d29e7aa0> import 'ansible.module_utils.compat.selinux' # <<< 8454 1726882404.49708: stdout chunk (state=3): >>># zipimport: zlib available <<< 8454 1726882404.49731: stdout chunk (state=3): >>># zipimport: zlib available <<< 8454 1726882404.49768: stdout chunk (state=3): >>>import 'ansible.module_utils._text' # # zipimport: zlib available <<< 8454 1726882404.49931: stdout chunk (state=3): >>># zipimport: zlib available <<< 8454 1726882404.50174: stdout chunk (state=3): >>># 
/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py # code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' <<< 8454 1726882404.50177: stdout chunk (state=3): >>>import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb1d286dc10> <<< 8454 1726882404.50181: stdout chunk (state=3): >>># zipimport: zlib available <<< 8454 1726882404.50757: stdout chunk (state=3): >>># zipimport: zlib available <<< 8454 1726882404.51317: stdout chunk (state=3): >>># zipimport: zlib available <<< 8454 1726882404.51388: stdout chunk (state=3): >>># zipimport: zlib available <<< 8454 1726882404.51484: stdout chunk (state=3): >>>import 'ansible.module_utils.common.collections' # <<< 8454 1726882404.51560: stdout chunk (state=3): >>># zipimport: zlib available <<< 8454 1726882404.51566: stdout chunk (state=3): >>># zipimport: zlib available <<< 8454 1726882404.51645: stdout chunk (state=3): >>>import 'ansible.module_utils.common.warnings' # # zipimport: zlib available <<< 8454 1726882404.51671: stdout chunk (state=3): >>># zipimport: zlib available <<< 8454 1726882404.51885: stdout chunk (state=3): >>>import 'ansible.module_utils.errors' # <<< 8454 1726882404.51888: stdout chunk (state=3): >>># zipimport: zlib available <<< 8454 1726882404.51901: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.parsing' # # zipimport: zlib available # zipimport: zlib available <<< 8454 1726882404.51929: stdout chunk (state=3): >>>import 'ansible.module_utils.parsing.convert_bool' # <<< 8454 1726882404.51951: stdout chunk (state=3): >>># zipimport: zlib available <<< 8454 1726882404.52232: stdout chunk (state=3): >>># zipimport: zlib available <<< 8454 1726882404.52644: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py <<< 8454 1726882404.52678: stdout chunk (state=3): >>># code object from 
'/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' import '_ast' # <<< 8454 1726882404.52693: stdout chunk (state=3): >>>import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb1d286fa10> <<< 8454 1726882404.52718: stdout chunk (state=3): >>># zipimport: zlib available <<< 8454 1726882404.52789: stdout chunk (state=3): >>># zipimport: zlib available <<< 8454 1726882404.52879: stdout chunk (state=3): >>>import 'ansible.module_utils.common.text.formatters' # import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # <<< 8454 1726882404.52923: stdout chunk (state=3): >>>import 'ansible.module_utils.common.arg_spec' # # /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py <<< 8454 1726882404.52937: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' <<< 8454 1726882404.53168: stdout chunk (state=3): >>># extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' # extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb1d28762a0> <<< 8454 1726882404.53203: stdout chunk (state=3): >>># extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb1d2876ba0> <<< 8454 1726882404.53228: stdout chunk (state=3): >>>import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb1d29e7a10> # zipimport: zlib available <<< 8454 1726882404.53286: stdout chunk (state=3): >>># zipimport: zlib available <<< 8454 1726882404.53330: stdout chunk (state=3): 
>>>import 'ansible.module_utils.common.locale' # <<< 8454 1726882404.53350: stdout chunk (state=3): >>># zipimport: zlib available <<< 8454 1726882404.53390: stdout chunk (state=3): >>># zipimport: zlib available <<< 8454 1726882404.53445: stdout chunk (state=3): >>># zipimport: zlib available <<< 8454 1726882404.53502: stdout chunk (state=3): >>># zipimport: zlib available <<< 8454 1726882404.53590: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py <<< 8454 1726882404.53618: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' <<< 8454 1726882404.53718: stdout chunk (state=3): >>># extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb1d2875880> <<< 8454 1726882404.53758: stdout chunk (state=3): >>>import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb1d2876cf0> <<< 8454 1726882404.53821: stdout chunk (state=3): >>>import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # # zipimport: zlib available <<< 8454 1726882404.53871: stdout chunk (state=3): >>># zipimport: zlib available <<< 8454 1726882404.53950: stdout chunk (state=3): >>># zipimport: zlib available <<< 8454 1726882404.54124: stdout chunk (state=3): >>># zipimport: zlib available # /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' <<< 8454 1726882404.54128: 
stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py <<< 8454 1726882404.54159: stdout chunk (state=3): >>># code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py <<< 8454 1726882404.54163: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' <<< 8454 1726882404.54176: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py <<< 8454 1726882404.54244: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' <<< 8454 1726882404.54255: stdout chunk (state=3): >>>import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb1d2906f30> <<< 8454 1726882404.54303: stdout chunk (state=3): >>>import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb1d2883da0> <<< 8454 1726882404.54390: stdout chunk (state=3): >>>import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb1d287adb0> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb1d287aba0> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # <<< 8454 1726882404.54407: stdout chunk (state=3): >>># zipimport: zlib available <<< 8454 1726882404.54469: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # <<< 8454 1726882404.54756: stdout chunk (state=3): >>>import 'ansible.module_utils.basic' # # zipimport: zlib available # zipimport: zlib available import 'ansible.modules' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # 
zipimport: zlib available <<< 8454 1726882404.54820: stdout chunk (state=3): >>># zipimport: zlib available <<< 8454 1726882404.54848: stdout chunk (state=3): >>># zipimport: zlib available <<< 8454 1726882404.54939: stdout chunk (state=3): >>># zipimport: zlib available <<< 8454 1726882404.54943: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.namespace' # <<< 8454 1726882404.54954: stdout chunk (state=3): >>># zipimport: zlib available <<< 8454 1726882404.55105: stdout chunk (state=3): >>># zipimport: zlib available <<< 8454 1726882404.55126: stdout chunk (state=3): >>># zipimport: zlib available <<< 8454 1726882404.55148: stdout chunk (state=3): >>># zipimport: zlib available <<< 8454 1726882404.55165: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.typing' # <<< 8454 1726882404.55240: stdout chunk (state=3): >>># zipimport: zlib available <<< 8454 1726882404.55453: stdout chunk (state=3): >>># zipimport: zlib available <<< 8454 1726882404.55583: stdout chunk (state=3): >>># zipimport: zlib available <<< 8454 1726882404.55618: stdout chunk (state=3): >>># zipimport: zlib available <<< 8454 1726882404.55690: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/__init__.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc' <<< 8454 1726882404.55754: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/context.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/process.py <<< 8454 1726882404.55769: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc' <<< 8454 
1726882404.55798: stdout chunk (state=3): >>>import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb1d290db20> <<< 8454 1726882404.55819: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/reduction.py <<< 8454 1726882404.55832: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc' <<< 8454 1726882404.55908: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc matches /usr/lib64/python3.12/pickle.py <<< 8454 1726882404.55916: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc' <<< 8454 1726882404.56157: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc matches /usr/lib64/python3.12/_compat_pickle.py <<< 8454 1726882404.56161: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc' <<< 8454 1726882404.56163: stdout chunk (state=3): >>>import '_compat_pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb1d2348440> # extension module '_pickle' loaded from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' # extension module '_pickle' executed from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' import '_pickle' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb1d2348740> import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb1d28f54c0> import 'multiprocessing.reduction' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb1d28f4890> import 'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb1d290c200> import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb1d290cc80> # 
/usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/pool.py <<< 8454 1726882404.56204: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc' <<< 8454 1726882404.56222: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc matches /usr/lib64/python3.12/queue.py <<< 8454 1726882404.56243: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc' <<< 8454 1726882404.56392: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc matches /usr/lib64/python3.12/heapq.py # code object from '/usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc' <<< 8454 1726882404.56406: stdout chunk (state=3): >>># extension module '_heapq' loaded from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' # extension module '_heapq' executed from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb1d234b800> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb1d234b0b0> # extension module '_queue' loaded from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' # extension module '_queue' executed from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb1d234b290> import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb1d234a4e0> # /usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/util.py <<< 8454 1726882404.56510: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc' import 'multiprocessing.util' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7fb1d234b9e0> <<< 8454 1726882404.56628: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/connection.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc' # extension module '_multiprocessing' loaded from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' # extension module '_multiprocessing' executed from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb1d23b2510> <<< 8454 1726882404.56672: stdout chunk (state=3): >>>import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb1d23b0530> <<< 8454 1726882404.56676: stdout chunk (state=3): >>>import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb1d290d2e0> <<< 8454 1726882404.56679: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.timeout' # <<< 8454 1726882404.56717: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.collector' # <<< 8454 1726882404.56721: stdout chunk (state=3): >>># zipimport: zlib available <<< 8454 1726882404.56757: stdout chunk (state=3): >>># zipimport: zlib available <<< 8454 1726882404.56760: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.other' # # zipimport: zlib available <<< 8454 1726882404.56851: stdout chunk (state=3): >>># zipimport: zlib available <<< 8454 1726882404.56937: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.other.facter' # <<< 8454 1726882404.56956: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 8454 1726882404.57010: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.other.ohai' # <<< 8454 1726882404.57056: stdout chunk (state=3): 
>>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system' # # zipimport: zlib available <<< 8454 1726882404.57092: stdout chunk (state=3): >>># zipimport: zlib available <<< 8454 1726882404.57163: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.apparmor' # # zipimport: zlib available <<< 8454 1726882404.57199: stdout chunk (state=3): >>># zipimport: zlib available <<< 8454 1726882404.57487: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.caps' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.chroot' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available <<< 8454 1726882404.57556: stdout chunk (state=3): >>># zipimport: zlib available <<< 8454 1726882404.57617: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.utils' # <<< 8454 1726882404.57704: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.cmdline' # # zipimport: zlib available <<< 8454 1726882404.58187: stdout chunk (state=3): >>># zipimport: zlib available <<< 8454 1726882404.58675: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.distribution' # <<< 8454 1726882404.58744: stdout chunk (state=3): >>># zipimport: zlib available <<< 8454 1726882404.58747: stdout chunk (state=3): >>># zipimport: zlib available <<< 8454 1726882404.58842: stdout chunk (state=3): >>># zipimport: zlib available <<< 8454 1726882404.58857: stdout chunk (state=3): >>># zipimport: zlib available <<< 8454 1726882404.58911: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.datetime' # import 'ansible.module_utils.facts.system.date_time' # # zipimport: zlib available <<< 8454 1726882404.58933: stdout chunk (state=3): >>># zipimport: zlib available <<< 8454 1726882404.59014: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.env' # # zipimport: zlib available <<< 8454 1726882404.59041: stdout 
chunk (state=3): >>># zipimport: zlib available <<< 8454 1726882404.59112: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.dns' # <<< 8454 1726882404.59132: stdout chunk (state=3): >>># zipimport: zlib available <<< 8454 1726882404.59232: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.system.fips' # # zipimport: zlib available # zipimport: zlib available <<< 8454 1726882404.59284: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.loadavg' # <<< 8454 1726882404.59288: stdout chunk (state=3): >>># zipimport: zlib available <<< 8454 1726882404.59445: stdout chunk (state=3): >>># zipimport: zlib available <<< 8454 1726882404.59473: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/glob.py <<< 8454 1726882404.59477: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc' <<< 8454 1726882404.59511: stdout chunk (state=3): >>>import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb1d23b2a20> <<< 8454 1726882404.59563: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc matches /usr/lib64/python3.12/configparser.py # code object from '/usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc' <<< 8454 1726882404.59677: stdout chunk (state=3): >>>import 'configparser' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb1d23b34a0> import 'ansible.module_utils.facts.system.local' # <<< 8454 1726882404.59707: stdout chunk (state=3): >>># zipimport: zlib available <<< 8454 1726882404.59819: stdout chunk (state=3): >>># zipimport: zlib available <<< 8454 1726882404.59890: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.lsb' # # zipimport: zlib available <<< 8454 1726882404.59957: stdout chunk (state=3): >>># zipimport: zlib available <<< 8454 1726882404.60106: stdout chunk (state=3): >>>import 
'ansible.module_utils.facts.system.pkg_mgr' # # zipimport: zlib available <<< 8454 1726882404.60144: stdout chunk (state=3): >>># zipimport: zlib available <<< 8454 1726882404.60229: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.platform' # <<< 8454 1726882404.60232: stdout chunk (state=3): >>># zipimport: zlib available <<< 8454 1726882404.60331: stdout chunk (state=3): >>># zipimport: zlib available <<< 8454 1726882404.60337: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc matches /usr/lib64/python3.12/ssl.py <<< 8454 1726882404.60385: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc' <<< 8454 1726882404.60587: stdout chunk (state=3): >>># extension module '_ssl' loaded from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' # extension module '_ssl' executed from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb1d23eaa50> <<< 8454 1726882404.60762: stdout chunk (state=3): >>>import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb1d23d4c20> import 'ansible.module_utils.facts.system.python' # # zipimport: zlib available <<< 8454 1726882404.60808: stdout chunk (state=3): >>># zipimport: zlib available <<< 8454 1726882404.61052: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.selinux' # # zipimport: zlib available # zipimport: zlib available <<< 8454 1726882404.61072: stdout chunk (state=3): >>># zipimport: zlib available <<< 8454 1726882404.61202: stdout chunk (state=3): >>># zipimport: zlib available <<< 8454 1726882404.61359: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.version' # import 'ansible.module_utils.facts.system.service_mgr' # <<< 8454 1726882404.61386: stdout chunk (state=3): >>># zipimport: zlib available <<< 8454 1726882404.61555: stdout chunk (state=3): >>># 
zipimport: zlib available import 'ansible.module_utils.facts.system.ssh_pub_keys' # # zipimport: zlib available # zipimport: zlib available <<< 8454 1726882404.61583: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc matches /usr/lib64/python3.12/getpass.py # code object from '/usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc' <<< 8454 1726882404.61611: stdout chunk (state=3): >>># extension module 'termios' loaded from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' <<< 8454 1726882404.61633: stdout chunk (state=3): >>># extension module 'termios' executed from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb1d1cf6030> import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb1d23e8740> <<< 8454 1726882404.61650: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.user' # <<< 8454 1726882404.61670: stdout chunk (state=3): >>># zipimport: zlib available <<< 8454 1726882404.61685: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.hardware' # <<< 8454 1726882404.61852: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.base' # # zipimport: zlib available <<< 8454 1726882404.61984: stdout chunk (state=3): >>># zipimport: zlib available <<< 8454 1726882404.62160: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.aix' # <<< 8454 1726882404.62251: stdout chunk (state=3): >>># zipimport: zlib available <<< 8454 1726882404.62449: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available # zipimport: zlib available <<< 8454 1726882404.62505: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.sysctl' # <<< 8454 1726882404.62509: stdout chunk (state=3): >>>import 
'ansible.module_utils.facts.hardware.darwin' # <<< 8454 1726882404.62512: stdout chunk (state=3): >>># zipimport: zlib available <<< 8454 1726882404.62521: stdout chunk (state=3): >>># zipimport: zlib available <<< 8454 1726882404.62548: stdout chunk (state=3): >>># zipimport: zlib available <<< 8454 1726882404.62716: stdout chunk (state=3): >>># zipimport: zlib available <<< 8454 1726882404.62873: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.freebsd' # import 'ansible.module_utils.facts.hardware.dragonfly' # <<< 8454 1726882404.62930: stdout chunk (state=3): >>># zipimport: zlib available <<< 8454 1726882404.63035: stdout chunk (state=3): >>># zipimport: zlib available <<< 8454 1726882404.63175: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.hpux' # <<< 8454 1726882404.63178: stdout chunk (state=3): >>># zipimport: zlib available <<< 8454 1726882404.63213: stdout chunk (state=3): >>># zipimport: zlib available <<< 8454 1726882404.63452: stdout chunk (state=3): >>># zipimport: zlib available <<< 8454 1726882404.63882: stdout chunk (state=3): >>># zipimport: zlib available <<< 8454 1726882404.64456: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.linux' # <<< 8454 1726882404.64471: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.hurd' # # zipimport: zlib available <<< 8454 1726882404.64592: stdout chunk (state=3): >>># zipimport: zlib available <<< 8454 1726882404.64708: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.netbsd' # <<< 8454 1726882404.64963: stdout chunk (state=3): >>># zipimport: zlib available <<< 8454 1726882404.64966: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.hardware.openbsd' # # zipimport: zlib available <<< 8454 1726882404.65125: stdout chunk (state=3): >>># zipimport: zlib available <<< 8454 1726882404.65304: stdout chunk (state=3): >>>import 
'ansible.module_utils.facts.hardware.sunos' # <<< 8454 1726882404.65326: stdout chunk (state=3): >>># zipimport: zlib available <<< 8454 1726882404.65347: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.network' # <<< 8454 1726882404.65360: stdout chunk (state=3): >>># zipimport: zlib available <<< 8454 1726882404.65394: stdout chunk (state=3): >>># zipimport: zlib available <<< 8454 1726882404.65446: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.base' # <<< 8454 1726882404.65459: stdout chunk (state=3): >>># zipimport: zlib available <<< 8454 1726882404.65572: stdout chunk (state=3): >>># zipimport: zlib available <<< 8454 1726882404.65681: stdout chunk (state=3): >>># zipimport: zlib available <<< 8454 1726882404.65914: stdout chunk (state=3): >>># zipimport: zlib available <<< 8454 1726882404.66142: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.generic_bsd' # <<< 8454 1726882404.66157: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.aix' # <<< 8454 1726882404.66169: stdout chunk (state=3): >>># zipimport: zlib available <<< 8454 1726882404.66352: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.network.darwin' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.dragonfly' # # zipimport: zlib available <<< 8454 1726882404.66405: stdout chunk (state=3): >>># zipimport: zlib available <<< 8454 1726882404.66479: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.fc_wwn' # <<< 8454 1726882404.66496: stdout chunk (state=3): >>># zipimport: zlib available <<< 8454 1726882404.66515: stdout chunk (state=3): >>># zipimport: zlib available <<< 8454 1726882404.66546: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.freebsd' # <<< 8454 1726882404.66558: stdout chunk (state=3): >>># zipimport: zlib available <<< 8454 1726882404.66625: 
stdout chunk (state=3): >>># zipimport: zlib available <<< 8454 1726882404.66690: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.hpux' # <<< 8454 1726882404.66851: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.hurd' # # zipimport: zlib available <<< 8454 1726882404.67141: stdout chunk (state=3): >>># zipimport: zlib available <<< 8454 1726882404.67662: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.linux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.iscsi' # # zipimport: zlib available # zipimport: zlib available <<< 8454 1726882404.67669: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.nvme' # <<< 8454 1726882404.67686: stdout chunk (state=3): >>># zipimport: zlib available <<< 8454 1726882404.67714: stdout chunk (state=3): >>># zipimport: zlib available <<< 8454 1726882404.67752: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.netbsd' # <<< 8454 1726882404.67765: stdout chunk (state=3): >>># zipimport: zlib available <<< 8454 1726882404.67802: stdout chunk (state=3): >>># zipimport: zlib available <<< 8454 1726882404.67951: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.openbsd' # # zipimport: zlib available # zipimport: zlib available <<< 8454 1726882404.68029: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.sunos' # <<< 8454 1726882404.68048: stdout chunk (state=3): >>># zipimport: zlib available <<< 8454 1726882404.68066: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.virtual' # <<< 8454 1726882404.68255: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.base' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available <<< 8454 1726882404.68295: stdout 
chunk (state=3): >>># zipimport: zlib available <<< 8454 1726882404.68354: stdout chunk (state=3): >>># zipimport: zlib available <<< 8454 1726882404.68426: stdout chunk (state=3): >>># zipimport: zlib available <<< 8454 1726882404.68514: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.sysctl' # import 'ansible.module_utils.facts.virtual.freebsd' # <<< 8454 1726882404.68533: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.dragonfly' # <<< 8454 1726882404.68548: stdout chunk (state=3): >>># zipimport: zlib available <<< 8454 1726882404.68598: stdout chunk (state=3): >>># zipimport: zlib available <<< 8454 1726882404.68752: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.hpux' # # zipimport: zlib available <<< 8454 1726882404.68894: stdout chunk (state=3): >>># zipimport: zlib available <<< 8454 1726882404.69114: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.linux' # <<< 8454 1726882404.69153: stdout chunk (state=3): >>># zipimport: zlib available <<< 8454 1726882404.69172: stdout chunk (state=3): >>># zipimport: zlib available <<< 8454 1726882404.69226: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.netbsd' # <<< 8454 1726882404.69354: stdout chunk (state=3): >>># zipimport: zlib available <<< 8454 1726882404.69457: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.virtual.openbsd' # # zipimport: zlib available # zipimport: zlib available <<< 8454 1726882404.69538: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.sunos' # import 'ansible.module_utils.facts.default_collectors' # <<< 8454 1726882404.69552: stdout chunk (state=3): >>># zipimport: zlib available <<< 8454 1726882404.69660: stdout chunk (state=3): >>># zipimport: zlib available <<< 8454 1726882404.69756: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.ansible_collector' # <<< 8454 1726882404.69769: stdout chunk 
(state=3): >>>import 'ansible.module_utils.facts.compat' # import 'ansible.module_utils.facts' # <<< 8454 1726882404.69952: stdout chunk (state=3): >>># zipimport: zlib available <<< 8454 1726882404.70377: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc matches /usr/lib64/python3.12/encodings/idna.py <<< 8454 1726882404.70381: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc' <<< 8454 1726882404.70400: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc matches /usr/lib64/python3.12/stringprep.py <<< 8454 1726882404.70411: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc' <<< 8454 1726882404.70456: stdout chunk (state=3): >>># extension module 'unicodedata' loaded from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' <<< 8454 1726882404.70470: stdout chunk (state=3): >>># extension module 'unicodedata' executed from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb1d1d232c0> import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb1d1d210a0> <<< 8454 1726882404.70565: stdout chunk (state=3): >>>import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb1d1d204a0> <<< 8454 1726882404.71337: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_apparmor": {"status": "disabled"}, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_distribution": "Fedora", "ansible_distribution_release": "", "ansible_distribution_version": "39", "ansible_distribution_major_version": "39", 
"ansible_distribution_file_path": "/etc/redhat-release", "ansible_distribution_file_variety": "RedHat", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_system": "Linux", "ansible_kernel": "6.10.9-100.fc39.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Mon Sep 9 02:28:01 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-41-238.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-41-238", "ansible_nodename": "ip-10-31-41-238.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec243b3f63949f99a85dd461938b27f6", "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "21", "minute": "33", "second": "24", "epoch": "1726882404", "epoch_int": "1726882404", "date": "2024-09-20", "time": "21:33:24", "iso8601_micro": "2024-09-21T01:33:24.706210Z", "iso8601": "2024-09-21T01:33:24Z", "iso8601_basic": "20240920T213324706210", "iso8601_basic_short": "20240920T213324", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_service_mgr": "systemd", "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_pkg_mgr": "dnf", "ansible_ssh_host_key_dsa_public": "AAAAB3NzaC1kc3MAAACBANYijgij1fEhTOf5yay/qzv1+ckF/sTeAcrQU7mSl4JlHlSjJFRS9ZOEcTyZ<<< 8454 1726882404.71352: stdout chunk (state=3): 
>>>hIM24bmrUmUAXGByisr1fJhHdM1w6H4TV9d8eAGz5dvqRt3OMFXU98TIudAuK2zln4nrfCSz2a6X/3opBJuckX9rZaO0ickijdGATG1zU5j6yse5AAAAFQCB9h5S0fKeFdZzGlNOZp7suEtGLQAAAIBqL8YusJlS5M+t8hqB5XoiVX2JRwxeO45o1F+YDEL8s88gEWv3QxNNB5xqhdMrEbA13n8FJWfZZdvcU7PONunHJRbKJZFHcCdK5TI9eGObNVaZTYNSFhZ2BieAeUf4m7eiHxQI/o71WHee8ehKt8oSXovKXzKzzh6V8adityCM0QAAAIBN41A6QjMEnqm1991CkEko30YVBdWgdcunoDD7NJJoONNTR054WsZIbydxWyYVFa3fC4HcmSYJjXHxuSZuCmzFZYOzQSedVWWiET/kLEvIDxOZEQ44DCsa3zg29Ty97IbNNwLSIOFXoUbWllCQV9qge0q5dQ/J1wTQdymso3DyLQ==", "ansible_ssh_host_key_dsa_public_keytype": "ssh-dss", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQC9Kn//qIm4a6eEMMZR4qpdJnSJr3FLIr3UgGTmVZPamYNdQ29vbdZONWFxDxoVqR69dvgMi7B4CVdaUu6QyBPlMI1hnnTMD7yGuFvf0wDLvk2p5tQf1MwOc8WdJCPqkvcCYTOD4gBf/qOT2LG96u3e6y9NpDSs43WwFzV2YMOpEVnbg+17SPjuOyE07jTJi4gLXbcjXxt9rz8nQMlsQPFysakPATk6pjVZnnTWDcFUSfc1sUdO6IWl/O4jlB/QtP/FkO38YQbSYx1fiZNsk+JP6ZeZ4F0trwlxRemd9P6eEqtA9jVdvSCvJNHgZoMob64uw1c2P8BFaAByky5crE35ggw6pKcQTHTAHhrPBTx12gpwlL4rB+OoysKYhxI8VeW+TYiNWBxF+EUpmcj/QMfOOgNbIEeK+YfNZ606vwhkyjORVqaN3MswYozhtwmAoyxDKaTAYWXo4+d+GqZ7pURKpwdZrI8M7e8Nvd+dwpW3OtrfAqXvFwIrBrivFfWnDE0=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBBsLvf+TKoJqIfm5H0y7RP9w7PN99SnDATkd0bkTPwuIbQqBA6MAihYQaVCQtnKQCWC09GNZyMQSeayjLONajkY=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIME0e8Dw9KPaZbGCYYNAh3+j3dHxYuGhpELosAmEvhOR", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.9-100.fc39.x86_64", "root": "UUID=f92a5a40-e33d-4a6f-8746-997eff27cfbd", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.9-100.fc39.x86_64", "root": "UUID=f92a5a40-e33d-4a6f-8746-997eff27cfbd", 
"ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": ["tty0", "ttyS0,115200n8"]}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_dns": {"nameservers": ["127.0.0.53"], "options": {"edns0": true, "trust-ad": true}, "search": ["us-east-1.aws.redhat.com"]}, "ansible_lsb": {}, "ansible_local": {}, "ansible_fips": false, "ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "EDITOR": "/usr/bin/nano", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.14.145 37172 10.31.41.238 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.14.145 37172 22", "DEBUGINFOD_URLS": "https://debuginfod.fedoraproject.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "gather_subset": ["min"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["min"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 8454 1726882404.71945: stdout chunk (state=3): >>># clear sys.path_importer_cache <<< 8454 1726882404.71992: stdout chunk (state=3): >>># clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear 
sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr <<< 8454 1726882404.72017: stdout chunk (state=3): >>># cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct <<< 8454 1726882404.72157: stdout chunk 
(state=3): >>># cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _sha2 # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # 
cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing 
copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation <<< 8454 1726882404.72272: stdout chunk (state=3): >>># cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing 
ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _heapq # cleanup[2] removing heapq # destroy heapq # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns 
# cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys <<< 8454 1726882404.72283: stdout chunk (state=3): >>># cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd <<< 8454 1726882404.72286: stdout chunk (state=3): >>># cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] 
removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd<<< 8454 1726882404.72289: stdout chunk (state=3): >>> # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux <<< 8454 1726882404.72293: stdout chunk (state=3): >>># cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy 
ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other <<< 8454 1726882404.72450: stdout chunk (state=3): >>># destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy 
ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna <<< 8454 1726882404.72670: stdout chunk (state=3): >>># destroy _sitebuiltins <<< 8454 1726882404.72690: stdout chunk (state=3): >>># destroy importlib.machinery # destroy importlib._abc # destroy importlib.util <<< 8454 1726882404.72759: stdout chunk (state=3): >>># destroy _bz2 # destroy _compression # destroy _lzma # destroy binascii # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy ipaddress <<< 8454 1726882404.72957: stdout 
chunk (state=3): >>># destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy locale # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy _hashlib # destroy _blake2 # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy logging <<< 8454 1726882404.73024: stdout chunk (state=3): >>># destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.connection # destroy multiprocessing.pool # destroy signal <<< 8454 1726882404.73030: stdout chunk (state=3): >>># destroy pickle # destroy multiprocessing.context # destroy array # destroy _compat_pickle <<< 8454 1726882404.73075: stdout chunk (state=3): >>># destroy _pickle # destroy queue <<< 8454 1726882404.73078: stdout chunk (state=3): >>># destroy _heapq <<< 8454 1726882404.73080: stdout chunk (state=3): >>># destroy _queue # destroy multiprocessing.process # destroy unicodedata # destroy tempfile # destroy multiprocessing.util <<< 8454 1726882404.73252: stdout chunk (state=3): >>># destroy multiprocessing.reduction # destroy selectors # destroy _multiprocessing # destroy shlex # destroy fcntl # destroy datetime # destroy subprocess # destroy base64 # destroy _ssl # destroy ansible.module_utils.compat.selinux # destroy getpass # destroy pwd # destroy termios # destroy errno # destroy json # destroy socket # destroy struct # destroy glob # destroy fnmatch # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector <<< 8454 1726882404.73279: stdout chunk (state=3): >>># cleanup[3] wiping encodings.idna # destroy 
stringprep # cleanup[3] wiping configparser <<< 8454 1726882404.73303: stdout chunk (state=3): >>># cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves <<< 8454 1726882404.73552: stdout chunk (state=3): >>># destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _sha2 # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping 
_abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime <<< 8454 1726882404.73711: stdout chunk (state=3): >>># destroy sys.monitoring <<< 8454 1726882404.73718: stdout chunk (state=3): >>># destroy _socket <<< 8454 1726882404.73723: stdout chunk (state=3): >>># destroy _collections <<< 8454 1726882404.73739: stdout chunk (state=3): >>># destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser <<< 8454 1726882404.73821: stdout chunk (state=3): >>># destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves <<< 8454 1726882404.73841: stdout chunk (state=3): >>># destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal <<< 8454 1726882404.74054: stdout chunk (state=3): >>># clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy encodings.idna # destroy 
_codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _operator # destroy _sha2 # destroy _sre # destroy _string # destroy re <<< 8454 1726882404.74139: stdout chunk (state=3): >>># destroy itertools # destroy _abc # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks <<< 8454 1726882404.74517: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8454 1726882404.74530: stderr chunk (state=3): >>>Shared connection to 10.31.41.238 closed. <<< 8454 1726882404.74632: stderr chunk (state=3): >>><<< 8454 1726882404.74748: stdout chunk (state=3): >>><<< 8454 1726882404.74968: _low_level_execute_command() done: rc=0, stdout=import _frozen_importlib # frozen import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # import '_io' # import 'marshal' # import 'posix' # import '_frozen_importlib_external' # # installing zipimport hook import 'time' # import 'zipimport' # # installed zipimport hook # /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' import '_codecs' # import 'codecs' # # /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb1d3004530> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb1d2fd3b30> # /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from 
'/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb1d3006ab0> import '_signal' # import '_abc' # import 'abc' # import 'io' # import '_stat' # import 'stat' # import '_collections_abc' # import 'genericpath' # import 'posixpath' # import 'os' # import '_sitebuiltins' # Processing user site-packages Processing global site-packages Adding directory: '/usr/lib64/python3.12/site-packages' Adding directory: '/usr/lib/python3.12/site-packages' Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' # /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb1d2db9160> # /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py # code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb1d2db9fd0> import 'site' # Python 3.12.5 (main, Aug 7 2024, 00:00:00) [GCC 13.3.1 20240522 (Red Hat 13.3.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. 
# /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py # code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py # code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py # code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py # code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb1d2df7e90> # /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py # code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb1d2df7f50> # /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py # code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py # code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' import 'itertools' # # /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb1d2e2f860> # /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py # code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7fb1d2e2fef0> import '_collections' # import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb1d2e0fb30> import '_functools' # import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb1d2e0d1f0> import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb1d2df5040> # /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py # code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' import '_sre' # # /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py # code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py # code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb1d2e53800> import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb1d2e52420> # /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb1d2e0e2a0> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb1d2e50c80> # /usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py # code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb1d2e84800> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb1d2df42c0> # /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches 
/usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb1d2e84cb0> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb1d2e84b60> # extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb1d2e84f50> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb1d2df2de0> # /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py # code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb1d2e85640> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb1d2e85310> import 'importlib.machinery' # # /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb1d2e86540> import 'importlib.util' # import 'runpy' # # /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches 
/usr/lib64/python3.12/shutil.py # code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb1d2e9c770> import 'errno' # # extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb1d2e9deb0> # /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py # code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb1d2e9ed80> # extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb1d2e9f3b0> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb1d2e9e2d0> # /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py # code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' # extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7fb1d2e9fe30> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb1d2e9f560> import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb1d2e865a0> # /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py # code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py # code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' # extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb1d2b93cb0> # /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' # extension module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb1d2bbc7d0> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb1d2bbc530> # extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb1d2bbc6e0> # extension module '_sha2' loaded from '/usr/lib64/python3.12/lib-dynload/_sha2.cpython-312-x86_64-linux-gnu.so' # extension module '_sha2' executed 
from '/usr/lib64/python3.12/lib-dynload/_sha2.cpython-312-x86_64-linux-gnu.so' import '_sha2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb1d2bbc920> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb1d2b91e50> # /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py # code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py # code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb1d2bbe030> import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb1d2bbccb0> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb1d2e86c90> # /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py # code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py # code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb1d2bee3c0> # /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py # code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb1d2c06510> # /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py 
# code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' import 'ntpath' # # /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb1d2c3f2c0> # /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py # code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb1d2c65a60> import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb1d2c3f3e0> import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb1d2c071a0> # /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb1d2a80380> import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb1d2c05550> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb1d2bbef60> # code object from '/usr/lib64/python3.12/encodings/cp437.pyc' import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7fb1d2a80560> # zipimport: found 103 names in '/tmp/ansible_setup_payload_o9rx_zjr/ansible_setup_payload.zip' # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py # code object from 
'/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py # code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb1d2aea060> import '_typing' # import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb1d2ac0f50> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb1d2ac00b0> # zipimport: zlib available import 'ansible' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb1d2ac3ef0> # /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' # extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from 
'/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb1d2b199a0> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb1d2b19730> import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb1d2b19040> # /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb1d2b19a90> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb1d2aeaa80> import 'atexit' # # extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb1d2b1a750> # extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb1d2b1a990> # /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py # code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' import '_locale' # import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb1d2b1aed0> import 'pwd' # # /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py # code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb1d2980c20> # extension module 'select' loaded from 
'/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb1d2982840> # /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py # code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb1d2983200> # /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py # code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb1d29843e0> # /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py # code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb1d2986e10> # extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb1d2986f30> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb1d29850d0> # /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py # code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc 
matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py # code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb1d298acf0> import '_tokenize' # import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb1d29897c0> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb1d2989520> # /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py # code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb1d298bc50> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb1d29855e0> # extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb1d29cee40> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb1d29cefc0> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py # code object from 
'/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' # extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb1d29d0b90> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb1d29d0950> # /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py # code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' # extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb1d29d3110> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb1d29d1280> # /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py # code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py # code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' import '_string' # import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb1d29de840> import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb1d29d31d0> # extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # 
extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb1d29dfb30> # extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb1d29df9e0> # extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb1d29dfc80> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb1d29cf290> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py # code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' # extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb1d29e3380> # extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' # extension module 'array' executed from 
'/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb1d29e45c0> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb1d29e1af0> # extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb1d29e2e70> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb1d29e16d0> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.six' # import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # # /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' # extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb1d286c860> # /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches 
/usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb1d286d760> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb1d29e7aa0> import 'ansible.module_utils.compat.selinux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils._text' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py # code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb1d286dc10> # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.collections' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.warnings' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.errors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing.convert_bool' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py # code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' import '_ast' # import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb1d286fa10> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text.formatters' # import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # import 'ansible.module_utils.common.arg_spec' # # /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches 
/usr/lib64/python3.12/hashlib.py # code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' # extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' # extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb1d28762a0> # extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb1d2876ba0> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb1d29e7a10> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.locale' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py # code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' # extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb1d2875880> import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb1d2876cf0> import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available 
# /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' # /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py # code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py # code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb1d2906f30> import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb1d2883da0> import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb1d287adb0> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb1d287aba0> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # import 'ansible.module_utils.basic' # # zipimport: zlib available # zipimport: zlib available import 'ansible.modules' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.namespace' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.typing' # # zipimport: zlib 
available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/__init__.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/context.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/process.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc' import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb1d290db20> # /usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/reduction.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc matches /usr/lib64/python3.12/pickle.py # code object from '/usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc matches /usr/lib64/python3.12/_compat_pickle.py # code object from '/usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc' import '_compat_pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb1d2348440> # extension module '_pickle' loaded from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' # extension module '_pickle' executed from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' import '_pickle' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb1d2348740> import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 
0x7fb1d28f54c0> import 'multiprocessing.reduction' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb1d28f4890> import 'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb1d290c200> import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb1d290cc80> # /usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/pool.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc matches /usr/lib64/python3.12/queue.py # code object from '/usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc matches /usr/lib64/python3.12/heapq.py # code object from '/usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc' # extension module '_heapq' loaded from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' # extension module '_heapq' executed from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb1d234b800> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb1d234b0b0> # extension module '_queue' loaded from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' # extension module '_queue' executed from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb1d234b290> import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb1d234a4e0> # /usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/util.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc' import 'multiprocessing.util' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7fb1d234b9e0> # /usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/connection.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc' # extension module '_multiprocessing' loaded from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' # extension module '_multiprocessing' executed from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb1d23b2510> import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb1d23b0530> import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb1d290d2e0> import 'ansible.module_utils.facts.timeout' # import 'ansible.module_utils.facts.collector' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other.facter' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other.ohai' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.apparmor' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.caps' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.chroot' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.utils' # import 'ansible.module_utils.facts.system.cmdline' # # zipimport: zlib available # zipimport: zlib available import 
'ansible.module_utils.facts.system.distribution' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.datetime' # import 'ansible.module_utils.facts.system.date_time' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.env' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.dns' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.fips' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.loadavg' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/glob.py # code object from '/usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc' import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb1d23b2a20> # /usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc matches /usr/lib64/python3.12/configparser.py # code object from '/usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc' import 'configparser' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb1d23b34a0> import 'ansible.module_utils.facts.system.local' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.lsb' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.pkg_mgr' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.platform' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc matches /usr/lib64/python3.12/ssl.py # code object from '/usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc' # extension module '_ssl' loaded from 
'/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' # extension module '_ssl' executed from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb1d23eaa50> import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb1d23d4c20> import 'ansible.module_utils.facts.system.python' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.selinux' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.version' # import 'ansible.module_utils.facts.system.service_mgr' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.ssh_pub_keys' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc matches /usr/lib64/python3.12/getpass.py # code object from '/usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc' # extension module 'termios' loaded from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' # extension module 'termios' executed from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb1d1cf6030> import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb1d23e8740> import 'ansible.module_utils.facts.system.user' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.base' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.aix' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 
'ansible.module_utils.facts.sysctl' # import 'ansible.module_utils.facts.hardware.darwin' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.freebsd' # import 'ansible.module_utils.facts.hardware.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.hpux' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.linux' # import 'ansible.module_utils.facts.hardware.hurd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.sunos' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.base' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.generic_bsd' # import 'ansible.module_utils.facts.network.aix' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.darwin' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.fc_wwn' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.freebsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.hpux' # # zipimport: zlib available # zipimport: zlib 
available import 'ansible.module_utils.facts.network.hurd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.linux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.iscsi' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.nvme' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.sunos' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.base' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.sysctl' # import 'ansible.module_utils.facts.virtual.freebsd' # import 'ansible.module_utils.facts.virtual.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.hpux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.linux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.sunos' # import 'ansible.module_utils.facts.default_collectors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.ansible_collector' # import 
'ansible.module_utils.facts.compat' # import 'ansible.module_utils.facts' # # zipimport: zlib available # /usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc matches /usr/lib64/python3.12/encodings/idna.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc matches /usr/lib64/python3.12/stringprep.py # code object from '/usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc' # extension module 'unicodedata' loaded from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' # extension module 'unicodedata' executed from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb1d1d232c0> import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb1d1d210a0> import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb1d1d204a0> {"ansible_facts": {"ansible_apparmor": {"status": "disabled"}, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_distribution": "Fedora", "ansible_distribution_release": "", "ansible_distribution_version": "39", "ansible_distribution_major_version": "39", "ansible_distribution_file_path": "/etc/redhat-release", "ansible_distribution_file_variety": "RedHat", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_system": "Linux", "ansible_kernel": "6.10.9-100.fc39.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Mon Sep 9 02:28:01 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-41-238.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-41-238", "ansible_nodename": 
"ip-10-31-41-238.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec243b3f63949f99a85dd461938b27f6", "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "21", "minute": "33", "second": "24", "epoch": "1726882404", "epoch_int": "1726882404", "date": "2024-09-20", "time": "21:33:24", "iso8601_micro": "2024-09-21T01:33:24.706210Z", "iso8601": "2024-09-21T01:33:24Z", "iso8601_basic": "20240920T213324706210", "iso8601_basic_short": "20240920T213324", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_service_mgr": "systemd", "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_pkg_mgr": "dnf", "ansible_ssh_host_key_dsa_public": "AAAAB3NzaC1kc3MAAACBANYijgij1fEhTOf5yay/qzv1+ckF/sTeAcrQU7mSl4JlHlSjJFRS9ZOEcTyZhIM24bmrUmUAXGByisr1fJhHdM1w6H4TV9d8eAGz5dvqRt3OMFXU98TIudAuK2zln4nrfCSz2a6X/3opBJuckX9rZaO0ickijdGATG1zU5j6yse5AAAAFQCB9h5S0fKeFdZzGlNOZp7suEtGLQAAAIBqL8YusJlS5M+t8hqB5XoiVX2JRwxeO45o1F+YDEL8s88gEWv3QxNNB5xqhdMrEbA13n8FJWfZZdvcU7PONunHJRbKJZFHcCdK5TI9eGObNVaZTYNSFhZ2BieAeUf4m7eiHxQI/o71WHee8ehKt8oSXovKXzKzzh6V8adityCM0QAAAIBN41A6QjMEnqm1991CkEko30YVBdWgdcunoDD7NJJoONNTR054WsZIbydxWyYVFa3fC4HcmSYJjXHxuSZuCmzFZYOzQSedVWWiET/kLEvIDxOZEQ44DCsa3zg29Ty97IbNNwLSIOFXoUbWllCQV9qge0q5dQ/J1wTQdymso3DyLQ==", "ansible_ssh_host_key_dsa_public_keytype": "ssh-dss", "ansible_ssh_host_key_rsa_public": 
"AAAAB3NzaC1yc2EAAAADAQABAAABgQC9Kn//qIm4a6eEMMZR4qpdJnSJr3FLIr3UgGTmVZPamYNdQ29vbdZONWFxDxoVqR69dvgMi7B4CVdaUu6QyBPlMI1hnnTMD7yGuFvf0wDLvk2p5tQf1MwOc8WdJCPqkvcCYTOD4gBf/qOT2LG96u3e6y9NpDSs43WwFzV2YMOpEVnbg+17SPjuOyE07jTJi4gLXbcjXxt9rz8nQMlsQPFysakPATk6pjVZnnTWDcFUSfc1sUdO6IWl/O4jlB/QtP/FkO38YQbSYx1fiZNsk+JP6ZeZ4F0trwlxRemd9P6eEqtA9jVdvSCvJNHgZoMob64uw1c2P8BFaAByky5crE35ggw6pKcQTHTAHhrPBTx12gpwlL4rB+OoysKYhxI8VeW+TYiNWBxF+EUpmcj/QMfOOgNbIEeK+YfNZ606vwhkyjORVqaN3MswYozhtwmAoyxDKaTAYWXo4+d+GqZ7pURKpwdZrI8M7e8Nvd+dwpW3OtrfAqXvFwIrBrivFfWnDE0=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBBsLvf+TKoJqIfm5H0y7RP9w7PN99SnDATkd0bkTPwuIbQqBA6MAihYQaVCQtnKQCWC09GNZyMQSeayjLONajkY=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIME0e8Dw9KPaZbGCYYNAh3+j3dHxYuGhpELosAmEvhOR", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.9-100.fc39.x86_64", "root": "UUID=f92a5a40-e33d-4a6f-8746-997eff27cfbd", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.9-100.fc39.x86_64", "root": "UUID=f92a5a40-e33d-4a6f-8746-997eff27cfbd", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": ["tty0", "ttyS0,115200n8"]}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_dns": {"nameservers": ["127.0.0.53"], "options": {"edns0": true, "trust-ad": true}, "search": ["us-east-1.aws.redhat.com"]}, "ansible_lsb": {}, "ansible_local": {}, "ansible_fips": false, "ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "EDITOR": "/usr/bin/nano", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": 
"/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.14.145 37172 10.31.41.238 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.14.145 37172 22", "DEBUGINFOD_URLS": "https://debuginfod.fedoraproject.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "gather_subset": ["min"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["min"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} # clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # 
cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing 
_random # cleanup[2] removing _sha2 # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # 
cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy 
ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _heapq # cleanup[2] removing heapq # destroy heapq # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # 
cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] 
removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing 
ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # 
destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # 
destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy binascii # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy locale # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy _hashlib # destroy _blake2 # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy logging # destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.connection # destroy multiprocessing.pool # destroy signal # destroy pickle # destroy multiprocessing.context # destroy array # destroy _compat_pickle # destroy _pickle # destroy queue # destroy _heapq # destroy _queue # destroy multiprocessing.process # destroy unicodedata # destroy tempfile # destroy multiprocessing.util # destroy multiprocessing.reduction # destroy selectors # destroy _multiprocessing # destroy shlex # destroy fcntl # destroy datetime # destroy subprocess # destroy base64 # destroy _ssl # destroy ansible.module_utils.compat.selinux # destroy getpass # destroy pwd # destroy termios # destroy errno # destroy json # destroy socket # 
destroy struct # destroy glob # destroy fnmatch # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping configparser # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _sha2 # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # 
cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy encodings.idna # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy 
_operator # destroy _sha2 # destroy _sre # destroy _string # destroy re # destroy itertools # destroy _abc # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 Shared connection to 10.31.41.238 closed. 
[WARNING]: Module invocation had junk after the JSON data:
8454 1726882404.76735: done with _execute_module (setup, {'gather_subset': 'min', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882404.1407347-8564-196702054007739/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None})
8454 1726882404.76825: _low_level_execute_command(): starting
8454 1726882404.76828: _low_level_execute_command():
executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882404.1407347-8564-196702054007739/ > /dev/null 2>&1 && sleep 0'
8454 1726882404.76830: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<<
8454 1726882404.76854: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<<
8454 1726882404.76959: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
8454 1726882404.77000: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<<
8454 1726882404.77022: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<<
8454 1726882404.77048: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
8454 1726882404.77244: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<<
8454 1726882404.79783: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
8454 1726882404.79845: stderr chunk (state=3): >>><<<
8454 1726882404.79854: stdout chunk (state=3): >>><<<
8454 1726882404.79868: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0
8454 1726882404.79876: handler run complete
8454 1726882404.79929: variable 'ansible_facts' from source: unknown
8454 1726882404.80047: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
8454 1726882404.80388: variable 'ansible_facts' from source: unknown
8454 1726882404.80471: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
8454 1726882404.80588: attempt loop complete, returning result
8454 1726882404.80739: _execute() done
8454 1726882404.80742: dumping result to json
8454 1726882404.80745: done dumping result, returning
8454 1726882404.80747: done running TaskExecutor() for managed_node3/TASK: Gather the minimum subset of ansible_facts required by the network role test [0affe814-3a2d-f59f-16b9-0000000000dd]
8454 1726882404.80749: sending task result for task 0affe814-3a2d-f59f-16b9-0000000000dd
8454 1726882404.81273: done sending
task result for task 0affe814-3a2d-f59f-16b9-0000000000dd
8454 1726882404.81277: WORKER PROCESS EXITING
ok: [managed_node3]
8454 1726882404.81476: no more pending results, returning what we have
8454 1726882404.81481: results queue empty
8454 1726882404.81482: checking for any_errors_fatal
8454 1726882404.81484: done checking for any_errors_fatal
8454 1726882404.81485: checking for max_fail_percentage
8454 1726882404.81487: done checking for max_fail_percentage
8454 1726882404.81488: checking to see if all hosts have failed and the running result is not ok
8454 1726882404.81489: done checking to see if all hosts have failed
8454 1726882404.81490: getting the remaining hosts for this loop
8454 1726882404.81491: done getting the remaining hosts for this loop
8454 1726882404.81495: getting the next task for host managed_node3
8454 1726882404.81505: done getting next task for host managed_node3
8454 1726882404.81508: ^ task is: TASK: Check if system is ostree
8454 1726882404.81511: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
8454 1726882404.81514: getting variables
8454 1726882404.81516: in VariableManager get_vars()
8454 1726882404.81547: Calling all_inventory to load vars for managed_node3
8454 1726882404.81551: Calling groups_inventory to load vars for managed_node3
8454 1726882404.81555: Calling all_plugins_inventory to load vars for managed_node3
8454 1726882404.81743: Calling all_plugins_play to load vars for managed_node3
8454 1726882404.81747: Calling groups_plugins_inventory to load vars for managed_node3
8454 1726882404.81751: Calling groups_plugins_play to load vars for managed_node3
8454 1726882404.82022: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
8454 1726882404.82345: done with get_vars()
8454 1726882404.82356: done getting variables

TASK [Check if system is ostree] ***********************************************
task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:17
Friday 20 September 2024 21:33:24 -0400 (0:00:00.800) 0:00:02.842 ******
8454 1726882404.82466: entering _queue_task() for managed_node3/stat
8454 1726882404.82747: worker is 1 (out of 1 available)
8454 1726882404.82760: exiting _queue_task() for managed_node3/stat
8454 1726882404.82775: done queuing things up, now waiting for results queue to drain
8454 1726882404.82781: waiting for pending results...
8454 1726882404.82959: running TaskExecutor() for managed_node3/TASK: Check if system is ostree
8454 1726882404.83048: in run() - task 0affe814-3a2d-f59f-16b9-0000000000df
8454 1726882404.83060: variable 'ansible_search_path' from source: unknown
8454 1726882404.83064: variable 'ansible_search_path' from source: unknown
8454 1726882404.83104: calling self._execute()
8454 1726882404.83167: variable 'ansible_host' from source: host vars for 'managed_node3'
8454 1726882404.83174: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
8454 1726882404.83188: variable 'omit' from source: magic vars
8454 1726882404.83598: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name
8454 1726882404.83832: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py
8454 1726882404.83882: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py
8454 1726882404.83910: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py
8454 1726882404.83942: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py
8454 1726882404.84018: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False)
8454 1726882404.84040: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False)
8454 1726882404.84064: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False)
8454 1726882404.84096: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False)
8454 1726882404.84196: Evaluated conditional (not __network_is_ostree is defined): True
8454 1726882404.84202: variable 'omit' from source: magic vars
8454 1726882404.84237: variable 'omit' from source: magic vars
8454 1726882404.84267: variable 'omit' from source: magic vars
8454 1726882404.84294: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection
8454 1726882404.84318: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
8454 1726882404.84336: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell
8454 1726882404.84352: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
8454 1726882404.84361: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
8454 1726882404.84391: variable 'inventory_hostname' from source: host vars for 'managed_node3'
8454 1726882404.84394: variable 'ansible_host' from source: host vars for 'managed_node3'
8454 1726882404.84398: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
8454 1726882404.84482: Set connection var ansible_connection to ssh
8454 1726882404.84493: Set connection var ansible_shell_executable to /bin/sh
8454 1726882404.84500: Set connection var ansible_timeout to 10
8454 1726882404.84503: Set connection var ansible_shell_type to sh
8454 1726882404.84518: Set connection var ansible_pipelining to False
8454 1726882404.84521: Set connection var ansible_module_compression to ZIP_DEFLATED
8454 1726882404.84541: variable 'ansible_shell_executable' from source: unknown
8454 1726882404.84544: variable 'ansible_connection' from source: unknown
8454 1726882404.84548: variable 'ansible_module_compression' from source: unknown
8454 1726882404.84550: variable 'ansible_shell_type' from source: unknown
8454 1726882404.84555: variable 'ansible_shell_executable' from source: unknown
8454 1726882404.84559: variable 'ansible_host' from source: host vars for 'managed_node3'
8454 1726882404.84564: variable 'ansible_pipelining' from source: unknown
8454 1726882404.84566: variable 'ansible_timeout' from source: unknown
8454 1726882404.84572: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
8454 1726882404.84839: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action)
8454 1726882404.84843: variable 'omit' from source: magic vars
8454 1726882404.84846: starting attempt loop
8454 1726882404.84848: running the handler
8454 1726882404.84851: _low_level_execute_command(): starting
8454 1726882404.84853: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0'
8454 1726882404.85432: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<<
8454 1726882404.85453: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<<
8454 1726882404.85469: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<<
8454 1726882404.85492: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
8454 1726882404.85552: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
8454 1726882404.85618: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<<
8454 1726882404.85653: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<<
8454 1726882404.85676: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
8454 1726882404.85774: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<<
8454 1726882404.88411: stdout chunk (state=3): >>>/root <<<
8454 1726882404.88572: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
8454 1726882404.88640: stderr chunk (state=3): >>><<<
8454 1726882404.88643: stdout chunk (state=3): >>><<<
8454 1726882404.88675: _low_level_execute_command() done: rc=0, stdout=/root
, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023
debug1: Reading configuration data /root/.ssh/config
debug1: Reading configuration data /etc/ssh/ssh_config
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf
debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238
debug2: match not found
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config
debug1: configuration requests final Match pass
debug2: resolve_canonicalize: hostname 10.31.41.238 is address
debug1: re-parsing configuration
debug1: Reading configuration data /root/.ssh/config
debug1: Reading configuration data /etc/ssh/ssh_config
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf
debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238
debug2: match found
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config
debug1: auto-mux: Trying existing master
debug2: fd 3 setting O_NONBLOCK
debug2: mux_client_hello_exchange: master version 4
debug1: mux_client_request_session: master session id: 4
debug2: Received exit status from master 0
8454 1726882404.88686: _low_level_execute_command(): starting
8454 1726882404.88689: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882404.8866105-8591-195548034638670 `" && echo ansible-tmp-1726882404.8866105-8591-195548034638670="` echo /root/.ansible/tmp/ansible-tmp-1726882404.8866105-8591-195548034638670 `" ) && sleep 0'
8454 1726882404.89353: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
8454 1726882404.89384: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<<
8454 1726882404.89401: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<<
8454 1726882404.89421: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
8454 1726882404.89572: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<<
8454 1726882404.92506: stdout chunk (state=3): >>>ansible-tmp-1726882404.8866105-8591-195548034638670=/root/.ansible/tmp/ansible-tmp-1726882404.8866105-8591-195548034638670 <<<
8454 1726882404.92725: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
8454 1726882404.92729: stdout chunk (state=3): >>><<<
8454 1726882404.92741: stderr chunk (state=3): >>><<<
8454 1726882404.92753: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882404.8866105-8591-195548034638670=/root/.ansible/tmp/ansible-tmp-1726882404.8866105-8591-195548034638670
, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023
debug1: Reading configuration data /root/.ssh/config
debug1: Reading configuration data /etc/ssh/ssh_config
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf
debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238
debug2: match not found
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config
debug1: configuration requests final Match pass
debug2: resolve_canonicalize: hostname 10.31.41.238 is address
debug1: re-parsing configuration
debug1: Reading configuration data /root/.ssh/config
debug1: Reading configuration data /etc/ssh/ssh_config
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf
debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238
debug2: match found
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config
debug1: auto-mux: Trying existing master
debug2: fd 3 setting O_NONBLOCK
debug2: mux_client_hello_exchange: master version 4
debug1: mux_client_request_session: master session id: 4
debug2: Received exit status from master 0
8454 1726882404.92797: variable 'ansible_module_compression' from source: unknown
8454 1726882404.92847: ANSIBALLZ: Using lock for stat
8454 1726882404.92851: ANSIBALLZ: Acquiring lock
8454 1726882404.92854: ANSIBALLZ: Lock acquired: 140055527345856
8454 1726882404.92856: ANSIBALLZ: Creating module
8454 1726882405.05021: ANSIBALLZ: Writing module into payload
8454 1726882405.05240: ANSIBALLZ: Writing module
8454 1726882405.05245: ANSIBALLZ: Renaming module
8454 1726882405.05248: ANSIBALLZ: Done creating module
8454 1726882405.05250: variable 'ansible_facts' from source: unknown
8454 1726882405.05297: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882404.8866105-8591-195548034638670/AnsiballZ_stat.py
8454 1726882405.05563: Sending initial data
8454 1726882405.05597: Sent initial data (151 bytes)
8454 1726882405.06211: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
8454 1726882405.06241: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<<
8454 1726882405.06280: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<<
8454 1726882405.06410: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<<
8454 1726882405.08974: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<<
8454 1726882405.08982: stderr chunk (state=3): >>>debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<<
8454 1726882405.09087: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<<
8454 1726882405.09201: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-8454ynh6eu7y/tmpcuerr1nh /root/.ansible/tmp/ansible-tmp-1726882404.8866105-8591-195548034638670/AnsiballZ_stat.py <<<
8454 1726882405.09210: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882404.8866105-8591-195548034638670/AnsiballZ_stat.py" <<<
8454 1726882405.09321: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-8454ynh6eu7y/tmpcuerr1nh" to remote "/root/.ansible/tmp/ansible-tmp-1726882404.8866105-8591-195548034638670/AnsiballZ_stat.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882404.8866105-8591-195548034638670/AnsiballZ_stat.py" <<<
8454 1726882405.10592: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
8454 1726882405.10655: stderr chunk (state=3): >>><<<
8454 1726882405.10658: stdout chunk (state=3): >>><<<
8454 1726882405.10683: done transferring module to remote
8454 1726882405.10698: _low_level_execute_command(): starting
8454 1726882405.10702: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882404.8866105-8591-195548034638670/ /root/.ansible/tmp/ansible-tmp-1726882404.8866105-8591-195548034638670/AnsiballZ_stat.py && sleep 0'
8454 1726882405.11184: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<<
8454 1726882405.11192: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found <<<
8454 1726882405.11194: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<<
8454 1726882405.11197: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<<
8454 1726882405.11199: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
8454 1726882405.11247: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<<
8454 1726882405.11262: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
8454 1726882405.11385: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<<
8454 1726882405.14218: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
8454 1726882405.14302: stderr chunk (state=3): >>><<<
8454 1726882405.14306: stdout chunk (state=3): >>><<<
8454 1726882405.14317: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023
debug1: Reading configuration data /root/.ssh/config
debug1: Reading configuration data /etc/ssh/ssh_config
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf
debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238
debug2: match not found
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config
debug1: configuration requests final Match pass
debug2: resolve_canonicalize: hostname 10.31.41.238 is address
debug1: re-parsing configuration
debug1: Reading configuration data /root/.ssh/config
debug1: Reading configuration data /etc/ssh/ssh_config
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf
debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238
debug2: match found
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config
debug1: auto-mux: Trying existing master
debug2: fd 3 setting O_NONBLOCK
debug2: mux_client_hello_exchange: master version 4
debug1: mux_client_request_session: master session id: 4
debug2: Received exit status from master 0
8454 1726882405.14320: _low_level_execute_command(): starting
8454 1726882405.14338: _low_level_execute_command(): executing: /bin/sh -c 'PYTHONVERBOSE=1 /usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882404.8866105-8591-195548034638670/AnsiballZ_stat.py && sleep 0'
8454 1726882405.14996: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
8454 1726882405.15023: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<<
8454 1726882405.15137: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<<
8454 1726882405.18413: stdout chunk (state=3): >>>import _frozen_importlib # frozen <<<
8454 1726882405.18458: stdout chunk (state=3): >>>import _imp # builtin <<<
8454 1726882405.18505: stdout chunk (state=3): >>>import '_thread' # <<<
8454 1726882405.18668: stdout chunk (state=3): >>>import '_warnings' # import '_weakref' # import '_io' # import 'marshal' # <<<
8454 1726882405.18706: stdout chunk (state=3): >>>import 'posix' # <<<
8454 1726882405.18765: stdout chunk (state=3): >>>import '_frozen_importlib_external' # <<<
8454 1726882405.18777: stdout chunk (state=3): >>># installing zipimport hook <<<
8454 1726882405.18817: stdout chunk (state=3): >>>import 'time' # <<<
8454 1726882405.18847: stdout chunk (state=3): >>>import 'zipimport' # # installed zipimport hook <<<
8454 1726882405.19178: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' import '_codecs' # import 'codecs' # # /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f823d40c530> <<<
8454 1726882405.19203: stdout chunk (state=3): >>>import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f823d3dbb30> # /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py <<<
8454 1726882405.19433: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7f823d40eab0> import '_signal' # import '_abc' # import 'abc' # import 'io' # import '_stat' # import 'stat' # <<<
8454 1726882405.19494: stdout chunk (state=3): >>>import '_collections_abc' # <<<
8454 1726882405.19542: stdout chunk (state=3): >>>import 'genericpath' # import 'posixpath' # <<<
8454 1726882405.19611: stdout chunk (state=3): >>>import 'os' # <<<
8454 1726882405.19732: stdout chunk (state=3): >>>import '_sitebuiltins' # <<<
8454 1726882405.19738: stdout chunk (state=3): >>>Processing user site-packages Processing global site-packages Adding directory: '/usr/lib64/python3.12/site-packages' Adding directory: '/usr/lib/python3.12/site-packages' <<<
8454 1726882405.19794: stdout chunk (state=3): >>>Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' # /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' <<<
8454 1726882405.19840: stdout chunk (state=3): >>>import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7f823d1e1160> <<<
8454 1726882405.19951: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py <<<
8454 1726882405.19986: stdout chunk (state=3): >>># code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7f823d1e1fd0> <<<
8454 1726882405.20052: stdout chunk (state=3): >>>import 'site' # <<<
8454 1726882405.20070: stdout chunk (state=3): >>>Python 3.12.5 (main, Aug 7 2024, 00:00:00) [GCC 13.3.1 20240522 (Red Hat 13.3.1-1)] on linux<<<
8454 1726882405.20277: stdout chunk (state=3): >>> Type "help", "copyright", "credits" or "license" for more information. <<<
8454 1726882405.20490: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py <<<
8454 1726882405.20519: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' <<<
8454 1726882405.20626: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py <<<
8454 1726882405.20629: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py <<<
8454 1726882405.20703: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' <<<
8454 1726882405.20719: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py <<<
8454 1726882405.20753: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' <<<
8454 1726882405.20794: stdout chunk (state=3): >>>import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f823d21fe90> <<<
8454 1726882405.20824: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py <<<
8454 1726882405.20863: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' <<<
8454 1726882405.20924: stdout chunk (state=3): >>>import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7f823d21ff50> <<<
8454 1726882405.20952: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py <<<
8454 1726882405.21007: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' <<<
8454 1726882405.21043: stdout chunk (state=3): >>># /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py <<<
8454 1726882405.21122: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' <<<
8454 1726882405.21156: stdout chunk (state=3): >>>import 'itertools' # <<<
8454 1726882405.21201: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py <<<
8454 1726882405.21204: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' <<<
8454 1726882405.21242: stdout chunk (state=3): >>>import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f823d257860> <<<
8454 1726882405.21430: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py # code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f823d257ef0> import '_collections' # import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f823d237b30> import '_functools' # <<<
8454 1726882405.21462: stdout chunk (state=3): >>>import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f823d2351f0> <<<
8454 1726882405.21598: stdout chunk (state=3): >>>import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f823d21d040> <<<
8454 1726882405.21731: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py # code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' <<<
8454 1726882405.21822: stdout chunk (state=3): >>>import '_sre' # # /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py # code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' <<<
8454 1726882405.21941: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py # code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7f823d27b800> import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f823d27a420> <<<
8454 1726882405.21966: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7f823d2362a0> <<<
8454 1726882405.22118: stdout chunk (state=3): >>>import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7f823d278c80> # /usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py # code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7f823d2ac800> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f823d21c2c0> <<<
8454 1726882405.22120: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py <<<
8454 1726882405.22139: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' <<<
8454 1726882405.22157: stdout chunk (state=3): >>># extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' <<<
8454 1726882405.22175: stdout chunk (state=3): >>># extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' <<<
8454 1726882405.22187: stdout chunk (state=3): >>>import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f823d2accb0> <<<
8454 1726882405.22202: stdout chunk (state=3): >>>import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f823d2acb60> <<<
8454 1726882405.22256: stdout chunk (state=3): >>># extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' <<<
8454 1726882405.22279: stdout chunk (state=3): >>># extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so'<<<
8454 1726882405.22374: stdout chunk (state=3): >>> import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f823d2acf50> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f823d21ade0> # /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py <<<
8454 1726882405.22576: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f823d2ad640> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f823d2ad310> import 'importlib.machinery' # # /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f823d2ae540> import 'importlib.util' # import 'runpy' # # /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py <<<
8454 1726882405.22598: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' <<<
8454 1726882405.22624: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py <<<
8454 1726882405.22651: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' <<<
8454 1726882405.22677: stdout chunk (state=3): >>>import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f823d2c4770> <<<
8454 1726882405.22679: stdout chunk (state=3): >>>import 'errno' # <<<
8454 1726882405.22722: stdout chunk (state=3): >>># extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' <<<
8454 1726882405.22743: stdout chunk (state=3): >>># extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' <<<
8454 1726882405.22747: stdout chunk (state=3): >>>import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f823d2c5eb0> <<<
8454 1726882405.22780: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py <<<
8454 1726882405.22811:
stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' <<< 8454 1726882405.22848: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py <<< 8454 1726882405.22871: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' <<< 8454 1726882405.22895: stdout chunk (state=3): >>>import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7f823d2c6d80> <<< 8454 1726882405.22972: stdout chunk (state=3): >>># extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' <<< 8454 1726882405.22999: stdout chunk (state=3): >>># extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' <<< 8454 1726882405.23002: stdout chunk (state=3): >>>import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f823d2c73b0> <<< 8454 1726882405.23060: stdout chunk (state=3): >>>import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7f823d2c62d0> # /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py # code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' <<< 8454 1726882405.23116: stdout chunk (state=3): >>># extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' <<< 8454 1726882405.23131: stdout chunk (state=3): >>># extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f823d2c7e30> <<< 8454 1726882405.23151: stdout chunk (state=3): >>>import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7f823d2c7560> <<< 8454 1726882405.23235: stdout chunk (state=3): >>>import 'shutil' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f823d2ae5a0> <<< 8454 1726882405.23259: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py <<< 8454 1726882405.23471: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py # code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' # extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f823d05bcb0> # /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' # extension module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f823d0847d0> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f823d084530> # extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' <<< 8454 1726882405.23476: stdout chunk (state=3): >>># extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' <<< 8454 1726882405.23495: stdout chunk (state=3): >>>import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f823d0846e0> <<< 8454 1726882405.23526: stdout chunk (state=3): >>># extension module '_sha2' loaded from 
'/usr/lib64/python3.12/lib-dynload/_sha2.cpython-312-x86_64-linux-gnu.so' <<< 8454 1726882405.23537: stdout chunk (state=3): >>># extension module '_sha2' executed from '/usr/lib64/python3.12/lib-dynload/_sha2.cpython-312-x86_64-linux-gnu.so' import '_sha2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f823d084920> <<< 8454 1726882405.23572: stdout chunk (state=3): >>>import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f823d059e50> <<< 8454 1726882405.23606: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py <<< 8454 1726882405.23780: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' <<< 8454 1726882405.23812: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py <<< 8454 1726882405.23839: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' <<< 8454 1726882405.23852: stdout chunk (state=3): >>>import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f823d086030> <<< 8454 1726882405.23899: stdout chunk (state=3): >>>import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f823d084cb0> <<< 8454 1726882405.23936: stdout chunk (state=3): >>>import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f823d2aec90> <<< 8454 1726882405.23989: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py <<< 8454 1726882405.24065: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' <<< 8454 1726882405.24110: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py <<< 8454 
1726882405.24361: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7f823d0b63c0> # /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py # code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' <<< 8454 1726882405.24437: stdout chunk (state=3): >>>import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f823d0ce510> <<< 8454 1726882405.24470: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py <<< 8454 1726882405.24561: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' <<< 8454 1726882405.24648: stdout chunk (state=3): >>>import 'ntpath' # <<< 8454 1726882405.24705: stdout chunk (state=3): >>># /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' <<< 8454 1726882405.24712: stdout chunk (state=3): >>>import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f823d1072c0> <<< 8454 1726882405.24738: stdout chunk (state=3): >>># /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py <<< 8454 1726882405.24816: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' <<< 8454 1726882405.24843: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches 
/usr/lib64/python3.12/ipaddress.py <<< 8454 1726882405.24928: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' <<< 8454 1726882405.25093: stdout chunk (state=3): >>>import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7f823d12da60> <<< 8454 1726882405.25217: stdout chunk (state=3): >>>import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f823d1073e0> <<< 8454 1726882405.25286: stdout chunk (state=3): >>>import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f823d0cf1a0> <<< 8454 1726882405.25326: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py <<< 8454 1726882405.25341: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' <<< 8454 1726882405.25344: stdout chunk (state=3): >>>import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f823cf48380> <<< 8454 1726882405.25379: stdout chunk (state=3): >>>import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7f823d0cd550> <<< 8454 1726882405.25425: stdout chunk (state=3): >>>import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f823d086f60> <<< 8454 1726882405.25767: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/cp437.pyc' import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7f823cf48560> # zipimport: found 30 names in '/tmp/ansible_stat_payload_wqkln4yk/ansible_stat_payload.zip' # zipimport: zlib available <<< 8454 1726882405.25973: stdout chunk (state=3): >>># zipimport: zlib available <<< 8454 1726882405.26007: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py <<< 8454 1726882405.26038: stdout chunk 
(state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' <<< 8454 1726882405.26105: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py <<< 8454 1726882405.26246: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' <<< 8454 1726882405.26292: stdout chunk (state=3): >>># /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py <<< 8454 1726882405.26314: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f823cf9e060> <<< 8454 1726882405.26325: stdout chunk (state=3): >>>import '_typing' # <<< 8454 1726882405.26657: stdout chunk (state=3): >>>import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f823cf74f50> <<< 8454 1726882405.26676: stdout chunk (state=3): >>>import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f823cf740b0> <<< 8454 1726882405.26701: stdout chunk (state=3): >>># zipimport: zlib available <<< 8454 1726882405.26730: stdout chunk (state=3): >>>import 'ansible' # <<< 8454 1726882405.26760: stdout chunk (state=3): >>># zipimport: zlib available <<< 8454 1726882405.26784: stdout chunk (state=3): >>># zipimport: zlib available <<< 8454 1726882405.26822: stdout chunk (state=3): >>># zipimport: zlib available <<< 8454 1726882405.26838: stdout chunk (state=3): >>>import 'ansible.module_utils' # <<< 8454 1726882405.26894: stdout chunk (state=3): >>># zipimport: zlib available <<< 8454 1726882405.29466: stdout chunk (state=3): >>># zipimport: zlib available <<< 8454 1726882405.31680: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py <<< 8454 1726882405.31698: stdout chunk 
(state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' <<< 8454 1726882405.31717: stdout chunk (state=3): >>>import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f823cf77ef0> <<< 8454 1726882405.32066: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' <<< 8454 1726882405.32072: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' # extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f823cfc59d0> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f823cfc5760> import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f823cfc5070> # /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' <<< 8454 1726882405.32099: stdout chunk (state=3): >>>import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f823cfc54c0> <<< 8454 1726882405.32117: stdout chunk (state=3): >>>import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f823cf9ecf0> <<< 8454 1726882405.32139: stdout chunk (state=3): >>>import 
'atexit' # <<< 8454 1726882405.32185: stdout chunk (state=3): >>># extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' <<< 8454 1726882405.32201: stdout chunk (state=3): >>># extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' <<< 8454 1726882405.32216: stdout chunk (state=3): >>>import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f823cfc6720> <<< 8454 1726882405.32256: stdout chunk (state=3): >>># extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' <<< 8454 1726882405.32275: stdout chunk (state=3): >>># extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' <<< 8454 1726882405.32286: stdout chunk (state=3): >>>import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f823cfc68a0> <<< 8454 1726882405.32328: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py <<< 8454 1726882405.32421: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' <<< 8454 1726882405.32460: stdout chunk (state=3): >>>import '_locale' # <<< 8454 1726882405.32528: stdout chunk (state=3): >>>import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f823cfc6db0> <<< 8454 1726882405.32558: stdout chunk (state=3): >>>import 'pwd' # <<< 8454 1726882405.32588: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py <<< 8454 1726882405.32635: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' <<< 8454 1726882405.32695: stdout chunk (state=3): >>>import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f823ce28c20> <<< 8454 1726882405.32739: 
stdout chunk (state=3): >>># extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' <<< 8454 1726882405.32863: stdout chunk (state=3): >>># extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f823ce2a840> # /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py # code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7f823ce2b200> <<< 8454 1726882405.32896: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py <<< 8454 1726882405.32936: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' <<< 8454 1726882405.32972: stdout chunk (state=3): >>>import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f823ce2c3e0> <<< 8454 1726882405.33010: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py <<< 8454 1726882405.33081: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' <<< 8454 1726882405.33114: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py <<< 8454 1726882405.33131: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' <<< 8454 1726882405.33241: stdout chunk (state=3): >>>import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f823ce2ee10> <<< 8454 1726882405.33295: stdout chunk (state=3): >>># extension module '_posixsubprocess' loaded from 
'/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' <<< 8454 1726882405.33462: stdout chunk (state=3): >>># extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f823ce2ef00> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7f823ce2d0a0> # /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py # code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py <<< 8454 1726882405.33478: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' <<< 8454 1726882405.33504: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py <<< 8454 1726882405.33528: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' <<< 8454 1726882405.33543: stdout chunk (state=3): >>>import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7f823ce32d50> <<< 8454 1726882405.33567: stdout chunk (state=3): >>>import '_tokenize' # <<< 8454 1726882405.33679: stdout chunk (state=3): >>>import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f823ce31820> <<< 8454 1726882405.33695: stdout chunk (state=3): >>>import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7f823ce31580> <<< 8454 1726882405.33720: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches 
/usr/lib64/python3.12/textwrap.py <<< 8454 1726882405.33743: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' <<< 8454 1726882405.33882: stdout chunk (state=3): >>>import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7f823ce33bc0> <<< 8454 1726882405.33930: stdout chunk (state=3): >>>import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7f823ce2d5e0> <<< 8454 1726882405.33968: stdout chunk (state=3): >>># extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' <<< 8454 1726882405.33986: stdout chunk (state=3): >>># extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f823ce7aea0> <<< 8454 1726882405.34028: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc'<<< 8454 1726882405.34042: stdout chunk (state=3): >>> <<< 8454 1726882405.34048: stdout chunk (state=3): >>>import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f823ce7b050> <<< 8454 1726882405.34095: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py <<< 8454 1726882405.34120: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' <<< 8454 1726882405.34365: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' # extension 
module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f823ce7cbf0> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f823ce7c9b0> # /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py <<< 8454 1726882405.34422: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' <<< 8454 1726882405.34502: stdout chunk (state=3): >>># extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' <<< 8454 1726882405.34518: stdout chunk (state=3): >>># extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' <<< 8454 1726882405.34529: stdout chunk (state=3): >>>import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f823ce7f110> <<< 8454 1726882405.34536: stdout chunk (state=3): >>>import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7f823ce7d2b0> <<< 8454 1726882405.34574: stdout chunk (state=3): >>># /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py <<< 8454 1726882405.34655: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' <<< 8454 1726882405.34699: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py <<< 8454 1726882405.34719: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' <<< 8454 1726882405.34746: stdout chunk (state=3): >>>import '_string' # <<< 8454 1726882405.34830: stdout chunk (state=3): >>>import 
'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7f823ce868d0> <<< 8454 1726882405.35109: stdout chunk (state=3): >>>import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f823ce7f260> <<< 8454 1726882405.35236: stdout chunk (state=3): >>># extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' <<< 8454 1726882405.35244: stdout chunk (state=3): >>># extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' <<< 8454 1726882405.35256: stdout chunk (state=3): >>>import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f823ce87740> <<< 8454 1726882405.35303: stdout chunk (state=3): >>># extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' <<< 8454 1726882405.35317: stdout chunk (state=3): >>># extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' <<< 8454 1726882405.35325: stdout chunk (state=3): >>>import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f823ce87590> <<< 8454 1726882405.35419: stdout chunk (state=3): >>># extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' <<< 8454 1726882405.35464: stdout chunk (state=3): >>># extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f823ce87b90> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f823ce7b350> <<< 8454 1726882405.35491: stdout chunk (state=3): >>># 
/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py <<< 8454 1726882405.35508: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' <<< 8454 1726882405.35658: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py # code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' # extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f823ce8b320> <<< 8454 1726882405.35963: stdout chunk (state=3): >>># extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' <<< 8454 1726882405.35984: stdout chunk (state=3): >>># extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' <<< 8454 1726882405.36000: stdout chunk (state=3): >>>import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f823ce8c8f0> <<< 8454 1726882405.36020: stdout chunk (state=3): >>>import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f823ce89a90> <<< 8454 1726882405.36072: stdout chunk (state=3): >>># extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' <<< 8454 1726882405.36087: stdout chunk (state=3): >>># extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' <<< 8454 1726882405.36100: stdout chunk (state=3): >>>import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 
0x7f823ce8ae40> <<< 8454 1726882405.36109: stdout chunk (state=3): >>>import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7f823ce896d0> <<< 8454 1726882405.36135: stdout chunk (state=3): >>># zipimport: zlib available <<< 8454 1726882405.36163: stdout chunk (state=3): >>># zipimport: zlib available <<< 8454 1726882405.36185: stdout chunk (state=3): >>>import 'ansible.module_utils.compat' # <<< 8454 1726882405.36213: stdout chunk (state=3): >>># zipimport: zlib available <<< 8454 1726882405.36563: stdout chunk (state=3): >>># zipimport: zlib available <<< 8454 1726882405.36569: stdout chunk (state=3): >>># zipimport: zlib available <<< 8454 1726882405.36602: stdout chunk (state=3): >>># zipimport: zlib available <<< 8454 1726882405.36612: stdout chunk (state=3): >>>import 'ansible.module_utils.common' # <<< 8454 1726882405.36642: stdout chunk (state=3): >>># zipimport: zlib available <<< 8454 1726882405.36669: stdout chunk (state=3): >>># zipimport: zlib available <<< 8454 1726882405.36693: stdout chunk (state=3): >>>import 'ansible.module_utils.common.text' # <<< 8454 1726882405.36716: stdout chunk (state=3): >>># zipimport: zlib available <<< 8454 1726882405.36970: stdout chunk (state=3): >>># zipimport: zlib available <<< 8454 1726882405.37213: stdout chunk (state=3): >>># zipimport: zlib available <<< 8454 1726882405.38409: stdout chunk (state=3): >>># zipimport: zlib available <<< 8454 1726882405.39617: stdout chunk (state=3): >>>import 'ansible.module_utils.six' # <<< 8454 1726882405.39637: stdout chunk (state=3): >>>import 'ansible.module_utils.six.moves' # <<< 8454 1726882405.39659: stdout chunk (state=3): >>>import 'ansible.module_utils.six.moves.collections_abc' # <<< 8454 1726882405.39673: stdout chunk (state=3): >>>import 'ansible.module_utils.common.text.converters' # <<< 8454 1726882405.39706: stdout chunk (state=3): >>># /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches 
/usr/lib64/python3.12/ctypes/__init__.py <<< 8454 1726882405.39747: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' <<< 8454 1726882405.39820: stdout chunk (state=3): >>># extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' <<< 8454 1726882405.40064: stdout chunk (state=3): >>># extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f823cf10a70> # /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f823cf11820> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7f823ce8f3e0> <<< 8454 1726882405.40116: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.selinux' # <<< 8454 1726882405.40147: stdout chunk (state=3): >>># zipimport: zlib available <<< 8454 1726882405.40182: stdout chunk (state=3): >>># zipimport: zlib available <<< 8454 1726882405.40213: stdout chunk (state=3): >>>import 'ansible.module_utils._text' # <<< 8454 1726882405.40241: stdout chunk (state=3): >>># zipimport: zlib available <<< 8454 1726882405.40551: stdout chunk (state=3): >>># zipimport: zlib available <<< 8454 1726882405.40868: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py <<< 8454 1726882405.40899: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' <<< 8454 1726882405.40910: stdout chunk (state=3): >>>import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f823cf11520> <<< 8454 1726882405.40937: stdout chunk (state=3): 
>>># zipimport: zlib available <<< 8454 1726882405.41902: stdout chunk (state=3): >>># zipimport: zlib available <<< 8454 1726882405.42873: stdout chunk (state=3): >>># zipimport: zlib available <<< 8454 1726882405.43020: stdout chunk (state=3): >>># zipimport: zlib available <<< 8454 1726882405.43171: stdout chunk (state=3): >>>import 'ansible.module_utils.common.collections' # <<< 8454 1726882405.43202: stdout chunk (state=3): >>># zipimport: zlib available <<< 8454 1726882405.43282: stdout chunk (state=3): >>># zipimport: zlib available <<< 8454 1726882405.43341: stdout chunk (state=3): >>>import 'ansible.module_utils.common.warnings' # <<< 8454 1726882405.43373: stdout chunk (state=3): >>># zipimport: zlib available <<< 8454 1726882405.43519: stdout chunk (state=3): >>># zipimport: zlib available <<< 8454 1726882405.43869: stdout chunk (state=3): >>>import 'ansible.module_utils.errors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing' # # zipimport: zlib available # zipimport: zlib available <<< 8454 1726882405.43875: stdout chunk (state=3): >>>import 'ansible.module_utils.parsing.convert_bool' # <<< 8454 1726882405.43904: stdout chunk (state=3): >>># zipimport: zlib available <<< 8454 1726882405.44396: stdout chunk (state=3): >>># zipimport: zlib available <<< 8454 1726882405.44961: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py <<< 8454 1726882405.45002: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' <<< 8454 1726882405.45039: stdout chunk (state=3): >>>import '_ast' # <<< 8454 1726882405.45196: stdout chunk (state=3): >>>import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f823cf125d0> <<< 8454 1726882405.45225: stdout chunk (state=3): >>># zipimport: zlib available <<< 8454 1726882405.45373: stdout chunk (state=3): >>># zipimport: zlib available <<< 8454 
1726882405.45571: stdout chunk (state=3): >>>import 'ansible.module_utils.common.text.formatters' # import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # import 'ansible.module_utils.common.arg_spec' # <<< 8454 1726882405.45579: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py <<< 8454 1726882405.45602: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' <<< 8454 1726882405.45731: stdout chunk (state=3): >>># extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' <<< 8454 1726882405.45940: stdout chunk (state=3): >>># extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f823cf1e3c0><<< 8454 1726882405.46164: stdout chunk (state=3): >>> # extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f823cf1ecf0> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f823cf13290> # zipimport: zlib available # zipimport: zlib available <<< 8454 1726882405.46220: stdout chunk (state=3): >>>import 'ansible.module_utils.common.locale' # <<< 8454 1726882405.46244: stdout chunk (state=3): >>># zipimport: zlib available <<< 8454 1726882405.46332: stdout chunk (state=3): >>># zipimport: zlib available <<< 8454 1726882405.46417: stdout chunk (state=3): >>># zipimport: zlib available <<< 8454 1726882405.46532: stdout chunk (state=3): >>># zipimport: zlib available <<< 8454 1726882405.46660: stdout chunk (state=3): >>># 
/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py <<< 8454 1726882405.46746: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' <<< 8454 1726882405.46887: stdout chunk (state=3): >>># extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' <<< 8454 1726882405.46917: stdout chunk (state=3): >>># extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' <<< 8454 1726882405.46920: stdout chunk (state=3): >>>import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f823cf1d970> <<< 8454 1726882405.47004: stdout chunk (state=3): >>>import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f823cf1ef30> <<< 8454 1726882405.47047: stdout chunk (state=3): >>>import 'ansible.module_utils.common.file' # <<< 8454 1726882405.47057: stdout chunk (state=3): >>>import 'ansible.module_utils.common.process' # <<< 8454 1726882405.47076: stdout chunk (state=3): >>># zipimport: zlib available <<< 8454 1726882405.47203: stdout chunk (state=3): >>># zipimport: zlib available <<< 8454 1726882405.47322: stdout chunk (state=3): >>># zipimport: zlib available <<< 8454 1726882405.47374: stdout chunk (state=3): >>># zipimport: zlib available <<< 8454 1726882405.47446: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py <<< 8454 1726882405.47461: stdout chunk (state=3): >>># code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' <<< 8454 1726882405.47559: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches 
/usr/lib/python3.12/site-packages/distro/distro.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py <<< 8454 1726882405.47645: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' <<< 8454 1726882405.47674: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py <<< 8454 1726882405.47707: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' <<< 8454 1726882405.47816: stdout chunk (state=3): >>>import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7f823cdaefc0> <<< 8454 1726882405.47900: stdout chunk (state=3): >>>import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f823cd2be30> <<< 8454 1726882405.48017: stdout chunk (state=3): >>>import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f823cd23560> <<< 8454 1726882405.48063: stdout chunk (state=3): >>>import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f823cd22c90> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # # zipimport: zlib available <<< 8454 1726882405.48105: stdout chunk (state=3): >>># zipimport: zlib available <<< 8454 1726882405.48146: stdout chunk (state=3): >>>import 'ansible.module_utils.common._utils' # <<< 8454 1726882405.48359: stdout chunk (state=3): >>>import 'ansible.module_utils.common.sys_info' # import 'ansible.module_utils.basic' # # zipimport: zlib available # zipimport: zlib available import 'ansible.modules' # # zipimport: zlib available <<< 8454 1726882405.48550: stdout chunk (state=3): >>># zipimport: zlib available <<< 8454 1726882405.48895: stdout chunk (state=3): >>># zipimport: zlib available <<< 8454 1726882405.49125: stdout chunk 
(state=3): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"path": "/run/ostree-booted", "follow": false, "get_checksum": true, "get_mime": true, "get_attributes": true, "checksum_algorithm": "sha1"}}} <<< 8454 1726882405.49151: stdout chunk (state=3): >>># destroy __main__ <<< 8454 1726882405.49651: stdout chunk (state=3): >>># clear sys.path_importer_cache <<< 8454 1726882405.49664: stdout chunk (state=3): >>># clear sys.path_hooks # clear builtins._<<< 8454 1726882405.49700: stdout chunk (state=3): >>> <<< 8454 1726882405.49706: stdout chunk (state=3): >>># clear sys.path # clear sys.argv # clear sys.ps1 <<< 8454 1726882405.49712: stdout chunk (state=3): >>># clear sys.ps2 # clear sys.last_exc # clear sys.last_type<<< 8454 1726882405.49737: stdout chunk (state=3): >>> # clear sys.last_value # clear sys.last_traceback <<< 8454 1726882405.49756: stdout chunk (state=3): >>># clear sys.__interactivehook__ # clear sys.meta_path <<< 8454 1726882405.49765: stdout chunk (state=3): >>># restore sys.stdin<<< 8454 1726882405.49777: stdout chunk (state=3): >>> <<< 8454 1726882405.49791: stdout chunk (state=3): >>># restore sys.stdout # restore sys.stderr <<< 8454 1726882405.49816: stdout chunk (state=3): >>># cleanup[2] removing sys # cleanup[2] removing builtins <<< 8454 1726882405.49820: stdout chunk (state=3): >>># cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings<<< 8454 1726882405.49988: stdout chunk (state=3): >>> # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # 
cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # destroy base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _sha2 # cleanup[2] removing random # destroy random # 
cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] 
removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # destroy socket <<< 8454 1726882405.50001: stdout chunk (state=3): >>># cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat <<< 8454 1726882405.50033: stdout chunk (state=3): >>># cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text <<< 8454 1726882405.50044: stdout chunk (state=3): >>># cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves<<< 8454 1726882405.50096: stdout chunk (state=3): >>> # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters <<< 8454 1726882405.50107: stdout chunk (state=3): >>># cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes<<< 8454 1726882405.50324: stdout chunk (state=3): >>> # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy<<< 8454 1726882405.50327: stdout chunk (state=3): >>> # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy 
ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules <<< 8454 1726882405.50521: stdout chunk (state=3): >>># destroy _sitebuiltins <<< 8454 1726882405.50544: stdout chunk (state=3): >>># destroy importlib.machinery # destroy importlib._abc # destroy importlib.util <<< 8454 1726882405.50569: stdout chunk (state=3): >>># destroy _bz2 <<< 8454 1726882405.50588: stdout chunk (state=3): >>># 
destroy _compression <<< 8454 1726882405.50597: stdout chunk (state=3): >>># destroy _lzma <<< 8454 1726882405.50618: stdout chunk (state=3): >>># destroy binascii # destroy struct # destroy zlib <<< 8454 1726882405.50646: stdout chunk (state=3): >>># destroy bz2 # destroy lzma <<< 8454 1726882405.50648: stdout chunk (state=3): >>># destroy zipfile._path <<< 8454 1726882405.50683: stdout chunk (state=3): >>># destroy zipfile # destroy pathlib # destroy zipfile._path.glob <<< 8454 1726882405.50699: stdout chunk (state=3): >>># destroy fnmatch # destroy ipaddress <<< 8454 1726882405.50741: stdout chunk (state=3): >>># destroy ntpath <<< 8454 1726882405.50968: stdout chunk (state=3): >>># destroy importlib # destroy zipimport # destroy __main__ # destroy tempfile # destroy systemd.journal # destroy systemd.daemon # destroy ansible.module_utils.compat.selinux # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy pwd # destroy locale # destroy signal # destroy fcntl # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy selectors # destroy errno # destroy array # destroy datetime # destroy _hashlib # destroy _blake2 # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy json # destroy logging # destroy shlex # destroy subprocess # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap <<< 
8454 1726882405.50974: stdout chunk (state=3): >>># cleanup[3] wiping tokenize <<< 8454 1726882405.51003: stdout chunk (state=3): >>># cleanup[3] wiping _tokenize # cleanup[3] wiping platform <<< 8454 1726882405.51018: stdout chunk (state=3): >>># cleanup[3] wiping atexit # cleanup[3] wiping _typing <<< 8454 1726882405.51040: stdout chunk (state=3): >>># cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading <<< 8454 1726882405.51057: stdout chunk (state=3): >>># cleanup[3] wiping weakref # cleanup[3] wiping _sha2 <<< 8454 1726882405.51074: stdout chunk (state=3): >>># cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math <<< 8454 1726882405.51084: stdout chunk (state=3): >>># cleanup[3] wiping warnings <<< 8454 1726882405.51105: stdout chunk (state=3): >>># cleanup[3] wiping importlib._bootstrap_external <<< 8454 1726882405.51116: stdout chunk (state=3): >>># cleanup[3] wiping importlib._bootstrap <<< 8454 1726882405.51128: stdout chunk (state=3): >>># cleanup[3] wiping _struct <<< 8454 1726882405.51150: stdout chunk (state=3): >>># cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler <<< 8454 1726882405.51157: stdout chunk (state=3): >>># destroy enum <<< 8454 1726882405.51170: stdout chunk (state=3): >>># cleanup[3] wiping copyreg <<< 8454 1726882405.51188: stdout chunk (state=3): >>># cleanup[3] wiping re._parser <<< 8454 1726882405.51192: stdout chunk (state=3): >>># cleanup[3] wiping _sre # cleanup[3] wiping functools<<< 8454 1726882405.51217: stdout chunk (state=3): >>> # cleanup[3] wiping _functools # cleanup[3] wiping collections <<< 8454 1726882405.51236: stdout chunk (state=3): >>># destroy _collections_abc <<< 8454 1726882405.51259: stdout chunk (state=3): >>># destroy collections.abc # cleanup[3] wiping _collections <<< 8454 1726882405.51271: stdout chunk (state=3): >>># cleanup[3] wiping itertools <<< 8454 
1726882405.51277: stdout chunk (state=3): >>># cleanup[3] wiping operator <<< 8454 1726882405.51363: stdout chunk (state=3): >>># cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime <<< 8454 1726882405.51542: stdout chunk (state=3): >>># destroy sys.monitoring <<< 8454 1726882405.51548: stdout chunk (state=3): >>># destroy _socket <<< 8454 1726882405.51578: stdout chunk (state=3): >>># destroy _collections <<< 8454 1726882405.51603: stdout chunk (state=3): >>># destroy platform <<< 8454 1726882405.51622: stdout chunk (state=3): >>># destroy _uuid <<< 8454 1726882405.51641: stdout chunk (state=3): >>># destroy stat # destroy genericpath <<< 8454 1726882405.51649: stdout chunk (state=3): >>># destroy re._parser <<< 8454 1726882405.51764: stdout chunk (state=3): >>># destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy 
ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal <<< 8454 1726882405.51790: stdout chunk (state=3): >>># clear sys.meta_path # clear sys.modules # destroy _frozen_importlib <<< 8454 1726882405.51923: stdout chunk (state=3): >>># destroy codecs # destroy encodings.aliases <<< 8454 1726882405.51938: stdout chunk (state=3): >>># destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 <<< 8454 1726882405.51954: stdout chunk (state=3): >>># destroy _codecs <<< 8454 1726882405.51958: stdout chunk (state=3): >>># destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading <<< 8454 1726882405.51982: stdout chunk (state=3): >>># destroy atexit <<< 8454 1726882405.51995: stdout chunk (state=3): >>># destroy _warnings # destroy math # destroy _bisect <<< 8454 1726882405.52004: stdout chunk (state=3): >>># destroy time <<< 8454 1726882405.52035: stdout chunk (state=3): >>># destroy _random<<< 8454 1726882405.52261: stdout chunk (state=3): >>> # destroy _weakref # destroy _operator # destroy _sha2 # destroy _string # destroy re # destroy itertools # destroy _abc # destroy _sre # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks <<< 8454 1726882405.52631: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.41.238 closed. 
<<< 8454 1726882405.52698: stderr chunk (state=3): >>><<< 8454 1726882405.52701: stdout chunk (state=3): >>><<< 8454 1726882405.52772: _low_level_execute_command() done: rc=0, stdout=import _frozen_importlib # frozen import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # import '_io' # import 'marshal' # import 'posix' # import '_frozen_importlib_external' # # installing zipimport hook import 'time' # import 'zipimport' # # installed zipimport hook # /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' import '_codecs' # import 'codecs' # # /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f823d40c530> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f823d3dbb30> # /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7f823d40eab0> import '_signal' # import '_abc' # import 'abc' # import 'io' # import '_stat' # import 'stat' # import '_collections_abc' # import 'genericpath' # import 'posixpath' # import 'os' # import '_sitebuiltins' # Processing user site-packages Processing global site-packages Adding directory: '/usr/lib64/python3.12/site-packages' Adding directory: '/usr/lib/python3.12/site-packages' Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' # /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches 
/usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7f823d1e1160> # /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py # code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7f823d1e1fd0> import 'site' # Python 3.12.5 (main, Aug 7 2024, 00:00:00) [GCC 13.3.1 20240522 (Red Hat 13.3.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. # /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py # code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py # code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py # code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py # code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f823d21fe90> # /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py # code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7f823d21ff50> # /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py # code object 
from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py # code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' import 'itertools' # # /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f823d257860> # /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py # code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f823d257ef0> import '_collections' # import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f823d237b30> import '_functools' # import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f823d2351f0> import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f823d21d040> # /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py # code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' import '_sre' # # /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py # code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py # code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7f823d27b800> import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f823d27a420> # 
/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7f823d2362a0> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7f823d278c80> # /usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py # code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7f823d2ac800> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f823d21c2c0> # /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f823d2accb0> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f823d2acb60> # extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f823d2acf50> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f823d21ade0> # /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' # 
/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py # code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f823d2ad640> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f823d2ad310> import 'importlib.machinery' # # /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f823d2ae540> import 'importlib.util' # import 'runpy' # # /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py # code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f823d2c4770> import 'errno' # # extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f823d2c5eb0> # /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py # code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7f823d2c6d80> # extension module '_bz2' 
loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f823d2c73b0> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7f823d2c62d0> # /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py # code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' # extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f823d2c7e30> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7f823d2c7560> import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f823d2ae5a0> # /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py # code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py # code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' # extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f823d05bcb0> # /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' # extension module '_bisect' loaded from 
'/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f823d0847d0> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f823d084530> # extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f823d0846e0> # extension module '_sha2' loaded from '/usr/lib64/python3.12/lib-dynload/_sha2.cpython-312-x86_64-linux-gnu.so' # extension module '_sha2' executed from '/usr/lib64/python3.12/lib-dynload/_sha2.cpython-312-x86_64-linux-gnu.so' import '_sha2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f823d084920> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f823d059e50> # /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py # code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py # code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f823d086030> import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f823d084cb0> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f823d2aec90> # /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py # code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' # 
/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py # code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7f823d0b63c0> # /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py # code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f823d0ce510> # /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py # code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' import 'ntpath' # # /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f823d1072c0> # /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py # code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7f823d12da60> import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f823d1073e0> import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f823d0cf1a0> # 
/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f823cf48380> import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7f823d0cd550> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f823d086f60> # code object from '/usr/lib64/python3.12/encodings/cp437.pyc' import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7f823cf48560> # zipimport: found 30 names in '/tmp/ansible_stat_payload_wqkln4yk/ansible_stat_payload.zip' # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py # code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py # code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f823cf9e060> import '_typing' # import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f823cf74f50> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f823cf740b0> # zipimport: zlib available import 'ansible' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # 
code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f823cf77ef0> # /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' # extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f823cfc59d0> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f823cfc5760> import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f823cfc5070> # /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f823cfc54c0> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f823cf9ecf0> import 'atexit' # # extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f823cfc6720> # extension module 
'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f823cfc68a0> # /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py # code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' import '_locale' # import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f823cfc6db0> import 'pwd' # # /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py # code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f823ce28c20> # extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f823ce2a840> # /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py # code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7f823ce2b200> # /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py # code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f823ce2c3e0> # /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py # code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches 
/usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f823ce2ee10> # extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f823ce2ef00> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7f823ce2d0a0> # /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py # code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py # code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7f823ce32d50> import '_tokenize' # import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f823ce31820> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7f823ce31580> # /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py # code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7f823ce33bc0> import 'traceback' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f823ce2d5e0> # extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f823ce7aea0> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f823ce7b050> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' # extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f823ce7cbf0> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f823ce7c9b0> # /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py # code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' # extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import 
'_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f823ce7f110> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7f823ce7d2b0> # /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py # code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py # code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' import '_string' # import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7f823ce868d0> import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f823ce7f260> # extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f823ce87740> # extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f823ce87590> # extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f823ce87b90> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f823ce7b350> # 
/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py # code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' # extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f823ce8b320> # extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f823ce8c8f0> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f823ce89a90> # extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f823ce8ae40> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7f823ce896d0> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text' # # zipimport: zlib available 
# zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.six' # import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # # /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' # extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f823cf10a70> # /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f823cf11820> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7f823ce8f3e0> import 'ansible.module_utils.compat.selinux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils._text' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py # code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f823cf11520> # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.collections' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.warnings' # # zipimport: zlib available # zipimport: zlib available import 
'ansible.module_utils.errors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing.convert_bool' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py # code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' import '_ast' # import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f823cf125d0> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text.formatters' # import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # import 'ansible.module_utils.common.arg_spec' # # /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py # code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' # extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' # extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f823cf1e3c0> # extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f823cf1ecf0> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f823cf13290> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.locale' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # 
/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py # code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' # extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f823cf1d970> import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f823cf1ef30> import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' # /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py # code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py # code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7f823cdaefc0> import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f823cd2be30> import 'distro.distro' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f823cd23560> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f823cd22c90> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # import 'ansible.module_utils.basic' # # zipimport: zlib available # zipimport: zlib available import 'ansible.modules' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"path": "/run/ostree-booted", "follow": false, "get_checksum": true, "get_mime": true, "get_attributes": true, "checksum_algorithm": "sha1"}}} # destroy __main__ # clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # 
cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # destroy base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _sha2 # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing 
contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # destroy socket # 
cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy 
ansible.module_utils.common.arg_spec # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy binascii # destroy struct # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy fnmatch # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy tempfile # destroy systemd.journal # destroy systemd.daemon # destroy ansible.module_utils.compat.selinux # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy pwd # destroy locale # destroy signal # destroy fcntl # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # 
destroy selectors # destroy errno # destroy array # destroy datetime # destroy _hashlib # destroy _blake2 # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy json # destroy logging # destroy shlex # destroy subprocess # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _sha2 # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # 
cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _operator # destroy _sha2 # destroy _string # destroy re # 
destroy itertools # destroy _abc # destroy _sre # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 Shared connection to 10.31.41.238 closed. 
[WARNING]: Module invocation had junk after the JSON data: # destroy __main__ # clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] 
removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # destroy base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _sha2 # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] 
removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # destroy socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes 
# destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing 
ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy binascii # destroy struct # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy fnmatch # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy tempfile # destroy systemd.journal # destroy systemd.daemon # destroy ansible.module_utils.compat.selinux # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy pwd # destroy locale # destroy signal # destroy fcntl # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy selectors # destroy errno # destroy array # destroy datetime # destroy _hashlib # destroy _blake2 # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy json # destroy logging # destroy shlex # destroy subprocess # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy 
linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _sha2 # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy sys.monitoring # destroy _socket # 
destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _operator # destroy _sha2 # destroy _string # destroy re # destroy itertools # destroy _abc # destroy _sre # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks 8454 1726882405.53356: done with _execute_module (stat, {'path': '/run/ostree-booted', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': 
'/root/.ansible/tmp/ansible-tmp-1726882404.8866105-8591-195548034638670/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 8454 1726882405.53359: _low_level_execute_command(): starting 8454 1726882405.53362: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882404.8866105-8591-195548034638670/ > /dev/null 2>&1 && sleep 0' 8454 1726882405.53510: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 8454 1726882405.53514: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 8454 1726882405.53517: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8454 1726882405.53572: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 8454 1726882405.53576: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 8454 1726882405.53706: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 8454 1726882405.56514: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8454 1726882405.56562: stderr chunk 
(state=3): >>><<< 8454 1726882405.56566: stdout chunk (state=3): >>><<< 8454 1726882405.56582: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 8454 1726882405.56586: handler run complete 8454 1726882405.56611: attempt loop complete, returning result 8454 1726882405.56614: _execute() done 8454 1726882405.56617: dumping result to json 8454 1726882405.56625: done dumping result, returning 8454 1726882405.56637: done running TaskExecutor() for managed_node3/TASK: Check if system is ostree [0affe814-3a2d-f59f-16b9-0000000000df] 8454 1726882405.56642: sending task result for task 0affe814-3a2d-f59f-16b9-0000000000df 8454 1726882405.56739: done sending task result for task 0affe814-3a2d-f59f-16b9-0000000000df 8454 1726882405.56743: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "stat": { "exists": false } } 8454 
1726882405.56817: no more pending results, returning what we have 8454 1726882405.56820: results queue empty 8454 1726882405.56821: checking for any_errors_fatal 8454 1726882405.56829: done checking for any_errors_fatal 8454 1726882405.56830: checking for max_fail_percentage 8454 1726882405.56832: done checking for max_fail_percentage 8454 1726882405.56833: checking to see if all hosts have failed and the running result is not ok 8454 1726882405.56836: done checking to see if all hosts have failed 8454 1726882405.56836: getting the remaining hosts for this loop 8454 1726882405.56838: done getting the remaining hosts for this loop 8454 1726882405.56842: getting the next task for host managed_node3 8454 1726882405.56848: done getting next task for host managed_node3 8454 1726882405.56852: ^ task is: TASK: Set flag to indicate system is ostree 8454 1726882405.56855: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8454 1726882405.56860: getting variables 8454 1726882405.56862: in VariableManager get_vars() 8454 1726882405.56898: Calling all_inventory to load vars for managed_node3 8454 1726882405.56901: Calling groups_inventory to load vars for managed_node3 8454 1726882405.56905: Calling all_plugins_inventory to load vars for managed_node3 8454 1726882405.56916: Calling all_plugins_play to load vars for managed_node3 8454 1726882405.56919: Calling groups_plugins_inventory to load vars for managed_node3 8454 1726882405.56923: Calling groups_plugins_play to load vars for managed_node3 8454 1726882405.57104: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8454 1726882405.57287: done with get_vars() 8454 1726882405.57297: done getting variables 8454 1726882405.57376: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) TASK [Set flag to indicate system is ostree] *********************************** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:22 Friday 20 September 2024 21:33:25 -0400 (0:00:00.749) 0:00:03.591 ****** 8454 1726882405.57402: entering _queue_task() for managed_node3/set_fact 8454 1726882405.57404: Creating lock for set_fact 8454 1726882405.57617: worker is 1 (out of 1 available) 8454 1726882405.57630: exiting _queue_task() for managed_node3/set_fact 8454 1726882405.57644: done queuing things up, now waiting for results queue to drain 8454 1726882405.57645: waiting for pending results... 
8454 1726882405.57800: running TaskExecutor() for managed_node3/TASK: Set flag to indicate system is ostree
8454 1726882405.57878: in run() - task 0affe814-3a2d-f59f-16b9-0000000000e0
8454 1726882405.57894: variable 'ansible_search_path' from source: unknown
8454 1726882405.57897: variable 'ansible_search_path' from source: unknown
8454 1726882405.57929: calling self._execute()
8454 1726882405.57997: variable 'ansible_host' from source: host vars for 'managed_node3'
8454 1726882405.58005: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
8454 1726882405.58015: variable 'omit' from source: magic vars
8454 1726882405.58415: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name
8454 1726882405.58617: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py
8454 1726882405.58661: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py
8454 1726882405.58693: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py
8454 1726882405.58724: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py
8454 1726882405.58816: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False)
8454 1726882405.58837: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False)
8454 1726882405.58862: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False)
8454 1726882405.58890: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False)
8454 1726882405.58994: Evaluated conditional (not __network_is_ostree is defined): True
8454 1726882405.59000: variable 'omit' from source: magic vars
8454 1726882405.59031: variable 'omit' from source: magic vars
8454 1726882405.59133: variable '__ostree_booted_stat' from source: set_fact
8454 1726882405.59173: variable 'omit' from source: magic vars
8454 1726882405.59201: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection
8454 1726882405.59225: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
8454 1726882405.59242: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell
8454 1726882405.59258: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
8454 1726882405.59267: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
8454 1726882405.59299: variable 'inventory_hostname' from source: host vars for 'managed_node3'
8454 1726882405.59304: variable 'ansible_host' from source: host vars for 'managed_node3'
8454 1726882405.59307: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
8454 1726882405.59390: Set connection var ansible_connection to ssh
8454 1726882405.59400: Set connection var ansible_shell_executable to /bin/sh
8454 1726882405.59406: Set connection var ansible_timeout to 10
8454 1726882405.59409: Set connection var ansible_shell_type to sh
8454 1726882405.59425: Set connection var ansible_pipelining to False
8454 1726882405.59428: Set connection var ansible_module_compression to ZIP_DEFLATED
8454 1726882405.59447: variable 'ansible_shell_executable' from source: unknown
8454 1726882405.59451: variable 'ansible_connection' from source: unknown
8454 1726882405.59453: variable 'ansible_module_compression' from source: unknown
8454 1726882405.59456: variable 'ansible_shell_type' from source: unknown
8454 1726882405.59461: variable 'ansible_shell_executable' from source: unknown
8454 1726882405.59464: variable 'ansible_host' from source: host vars for 'managed_node3'
8454 1726882405.59469: variable 'ansible_pipelining' from source: unknown
8454 1726882405.59472: variable 'ansible_timeout' from source: unknown
8454 1726882405.59478: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
8454 1726882405.59565: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False)
8454 1726882405.59574: variable 'omit' from source: magic vars
8454 1726882405.59580: starting attempt loop
8454 1726882405.59586: running the handler
8454 1726882405.59597: handler run complete
8454 1726882405.59606: attempt loop complete, returning result
8454 1726882405.59609: _execute() done
8454 1726882405.59611: dumping result to json
8454 1726882405.59617: done dumping result, returning
8454 1726882405.59624: done running TaskExecutor() for managed_node3/TASK: Set flag to indicate system is ostree [0affe814-3a2d-f59f-16b9-0000000000e0]
8454 1726882405.59634: sending task result for task 0affe814-3a2d-f59f-16b9-0000000000e0
8454 1726882405.59714: done sending task result for task 0affe814-3a2d-f59f-16b9-0000000000e0
8454 1726882405.59717: WORKER PROCESS EXITING
ok: [managed_node3] => {
    "ansible_facts": {
        "__network_is_ostree": false
    },
    "changed": false
}
8454 1726882405.59792: no more pending results, returning what we have
8454 1726882405.59795: results queue empty
8454 1726882405.59796: checking for any_errors_fatal
8454 1726882405.59802: done checking for any_errors_fatal
8454 1726882405.59803: checking for max_fail_percentage
8454 1726882405.59805: done checking for max_fail_percentage
8454 1726882405.59806: checking to see if all hosts have failed and the running result is not ok
8454 1726882405.59806: done checking to see if all hosts have failed
8454 1726882405.59807: getting the remaining hosts for this loop
8454 1726882405.59809: done getting the remaining hosts for this loop
8454 1726882405.59812: getting the next task for host managed_node3
8454 1726882405.59820: done getting next task for host managed_node3
8454 1726882405.59823: ^ task is: TASK: Fix CentOS6 Base repo
8454 1726882405.59826: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
8454 1726882405.59829: getting variables
8454 1726882405.59831: in VariableManager get_vars()
8454 1726882405.59859: Calling all_inventory to load vars for managed_node3
8454 1726882405.59862: Calling groups_inventory to load vars for managed_node3
8454 1726882405.59865: Calling all_plugins_inventory to load vars for managed_node3
8454 1726882405.59875: Calling all_plugins_play to load vars for managed_node3
8454 1726882405.59877: Calling groups_plugins_inventory to load vars for managed_node3
8454 1726882405.59885: Calling groups_plugins_play to load vars for managed_node3
8454 1726882405.60014: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
8454 1726882405.60172: done with get_vars()
8454 1726882405.60180: done getting variables
8454 1726882405.60276: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True)

TASK [Fix CentOS6 Base repo] ***************************************************
task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:26
Friday 20 September 2024 21:33:25 -0400 (0:00:00.028) 0:00:03.620 ******
8454 1726882405.60298: entering _queue_task() for managed_node3/copy
8454 1726882405.60484: worker is 1 (out of 1 available)
8454 1726882405.60496: exiting _queue_task() for managed_node3/copy
8454 1726882405.60509: done queuing things up, now waiting for results queue to drain
8454 1726882405.60510: waiting for pending results...
8454 1726882405.60663: running TaskExecutor() for managed_node3/TASK: Fix CentOS6 Base repo
8454 1726882405.60727: in run() - task 0affe814-3a2d-f59f-16b9-0000000000e2
8454 1726882405.60744: variable 'ansible_search_path' from source: unknown
8454 1726882405.60748: variable 'ansible_search_path' from source: unknown
8454 1726882405.60778: calling self._execute()
8454 1726882405.60837: variable 'ansible_host' from source: host vars for 'managed_node3'
8454 1726882405.60844: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
8454 1726882405.60854: variable 'omit' from source: magic vars
8454 1726882405.61276: variable 'ansible_distribution' from source: facts
8454 1726882405.61298: Evaluated conditional (ansible_distribution == 'CentOS'): False
8454 1726882405.61301: when evaluation is False, skipping this task
8454 1726882405.61304: _execute() done
8454 1726882405.61306: dumping result to json
8454 1726882405.61317: done dumping result, returning
8454 1726882405.61320: done running TaskExecutor() for managed_node3/TASK: Fix CentOS6 Base repo [0affe814-3a2d-f59f-16b9-0000000000e2]
8454 1726882405.61323: sending task result for task 0affe814-3a2d-f59f-16b9-0000000000e2
8454 1726882405.61419: done sending task result for task 0affe814-3a2d-f59f-16b9-0000000000e2
8454 1726882405.61423: WORKER PROCESS EXITING
skipping: [managed_node3] => {
    "changed": false,
    "false_condition": "ansible_distribution == 'CentOS'",
    "skip_reason": "Conditional result was False"
}
8454 1726882405.61483: no more pending results, returning what we have
8454 1726882405.61486: results queue empty
8454 1726882405.61487: checking for any_errors_fatal
8454 1726882405.61490: done checking for any_errors_fatal
8454 1726882405.61491: checking for max_fail_percentage
8454 1726882405.61493: done checking for max_fail_percentage
8454 1726882405.61494: checking to see if all hosts have failed and the running result is not ok
8454 1726882405.61495: done checking to see if all hosts have failed
8454 1726882405.61496: getting the remaining hosts for this loop
8454 1726882405.61497: done getting the remaining hosts for this loop
8454 1726882405.61500: getting the next task for host managed_node3
8454 1726882405.61506: done getting next task for host managed_node3
8454 1726882405.61509: ^ task is: TASK: Include the task 'enable_epel.yml'
8454 1726882405.61512: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
8454 1726882405.61515: getting variables
8454 1726882405.61516: in VariableManager get_vars()
8454 1726882405.61542: Calling all_inventory to load vars for managed_node3
8454 1726882405.61545: Calling groups_inventory to load vars for managed_node3
8454 1726882405.61547: Calling all_plugins_inventory to load vars for managed_node3
8454 1726882405.61554: Calling all_plugins_play to load vars for managed_node3
8454 1726882405.61556: Calling groups_plugins_inventory to load vars for managed_node3
8454 1726882405.61558: Calling groups_plugins_play to load vars for managed_node3
8454 1726882405.61712: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
8454 1726882405.61867: done with get_vars()
8454 1726882405.61874: done getting variables

TASK [Include the task 'enable_epel.yml'] **************************************
task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:51
Friday 20 September 2024 21:33:25 -0400 (0:00:00.016) 0:00:03.637 ******
8454 1726882405.61942: entering _queue_task() for managed_node3/include_tasks
8454 1726882405.62118: worker is 1 (out of 1 available)
8454 1726882405.62131: exiting _queue_task() for managed_node3/include_tasks
8454 1726882405.62143: done queuing things up, now waiting for results queue to drain
8454 1726882405.62145: waiting for pending results...
8454 1726882405.62279: running TaskExecutor() for managed_node3/TASK: Include the task 'enable_epel.yml'
8454 1726882405.62347: in run() - task 0affe814-3a2d-f59f-16b9-0000000000e3
8454 1726882405.62359: variable 'ansible_search_path' from source: unknown
8454 1726882405.62362: variable 'ansible_search_path' from source: unknown
8454 1726882405.62400: calling self._execute()
8454 1726882405.62448: variable 'ansible_host' from source: host vars for 'managed_node3'
8454 1726882405.62454: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
8454 1726882405.62464: variable 'omit' from source: magic vars
8454 1726882405.62848: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name
8454 1726882405.64533: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py
8454 1726882405.64592: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py
8454 1726882405.64622: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py
8454 1726882405.64670: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py
8454 1726882405.64698: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py
8454 1726882405.64762: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
8454 1726882405.64791: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
8454 1726882405.64812: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
8454 1726882405.64847: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
8454 1726882405.64860: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
8454 1726882405.64956: variable '__network_is_ostree' from source: set_fact
8454 1726882405.64972: Evaluated conditional (not __network_is_ostree | d(false)): True
8454 1726882405.64975: _execute() done
8454 1726882405.64978: dumping result to json
8454 1726882405.64985: done dumping result, returning
8454 1726882405.64993: done running TaskExecutor() for managed_node3/TASK: Include the task 'enable_epel.yml' [0affe814-3a2d-f59f-16b9-0000000000e3]
8454 1726882405.64996: sending task result for task 0affe814-3a2d-f59f-16b9-0000000000e3
8454 1726882405.65088: done sending task result for task 0affe814-3a2d-f59f-16b9-0000000000e3
8454 1726882405.65091: WORKER PROCESS EXITING
8454 1726882405.65122: no more pending results, returning what we have
8454 1726882405.65127: in VariableManager get_vars()
8454 1726882405.65159: Calling all_inventory to load vars for managed_node3
8454 1726882405.65163: Calling groups_inventory to load vars for managed_node3
8454 1726882405.65166: Calling all_plugins_inventory to load vars for managed_node3
8454 1726882405.65176: Calling all_plugins_play to load vars for managed_node3
8454 1726882405.65182: Calling groups_plugins_inventory to load vars for managed_node3
8454 1726882405.65186: Calling groups_plugins_play to load vars for managed_node3
8454 1726882405.65354: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
8454 1726882405.65509: done with get_vars()
8454 1726882405.65516: variable 'ansible_search_path' from source: unknown
8454 1726882405.65517: variable 'ansible_search_path' from source: unknown
8454 1726882405.65547: we have included files to process
8454 1726882405.65548: generating all_blocks data
8454 1726882405.65549: done generating all_blocks data
8454 1726882405.65554: processing included file: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml
8454 1726882405.65555: loading included file: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml
8454 1726882405.65557: Loading data from /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml
8454 1726882405.66122: done processing included file
8454 1726882405.66124: iterating over new_blocks loaded from include file
8454 1726882405.66125: in VariableManager get_vars()
8454 1726882405.66135: done with get_vars()
8454 1726882405.66137: filtering new block on tags
8454 1726882405.66157: done filtering new block on tags
8454 1726882405.66160: in VariableManager get_vars()
8454 1726882405.66170: done with get_vars()
8454 1726882405.66171: filtering new block on tags
8454 1726882405.66181: done filtering new block on tags
8454 1726882405.66183: done iterating over new_blocks loaded from include file
included: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml for managed_node3
8454 1726882405.66187: extending task lists for all hosts with included blocks
8454 1726882405.66267: done extending task lists
8454 1726882405.66269: done processing included files
8454 1726882405.66270: results queue empty
8454 1726882405.66270: checking for any_errors_fatal
8454 1726882405.66273: done checking for any_errors_fatal
8454 1726882405.66274: checking for max_fail_percentage
8454 1726882405.66275: done checking for max_fail_percentage
8454 1726882405.66276: checking to see if all hosts have failed and the running result is not ok
8454 1726882405.66277: done checking to see if all hosts have failed
8454 1726882405.66277: getting the remaining hosts for this loop
8454 1726882405.66278: done getting the remaining hosts for this loop
8454 1726882405.66281: getting the next task for host managed_node3
8454 1726882405.66284: done getting next task for host managed_node3
8454 1726882405.66286: ^ task is: TASK: Create EPEL {{ ansible_distribution_major_version }}
8454 1726882405.66288: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
8454 1726882405.66290: getting variables
8454 1726882405.66291: in VariableManager get_vars()
8454 1726882405.66297: Calling all_inventory to load vars for managed_node3
8454 1726882405.66298: Calling groups_inventory to load vars for managed_node3
8454 1726882405.66300: Calling all_plugins_inventory to load vars for managed_node3
8454 1726882405.66304: Calling all_plugins_play to load vars for managed_node3
8454 1726882405.66309: Calling groups_plugins_inventory to load vars for managed_node3
8454 1726882405.66312: Calling groups_plugins_play to load vars for managed_node3
8454 1726882405.66432: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
8454 1726882405.66585: done with get_vars()
8454 1726882405.66593: done getting variables
8454 1726882405.66648: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True)
8454 1726882405.66801: variable 'ansible_distribution_major_version' from source: facts

TASK [Create EPEL 39] **********************************************************
task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:8
Friday 20 September 2024 21:33:25 -0400 (0:00:00.048) 0:00:03.686 ******
8454 1726882405.66841: entering _queue_task() for managed_node3/command
8454 1726882405.66843: Creating lock for command
8454 1726882405.67018: worker is 1 (out of 1 available)
8454 1726882405.67029: exiting _queue_task() for managed_node3/command
8454 1726882405.67042: done queuing things up, now waiting for results queue to drain
8454 1726882405.67043: waiting for pending results...
8454 1726882405.67195: running TaskExecutor() for managed_node3/TASK: Create EPEL 39
8454 1726882405.67320: in run() - task 0affe814-3a2d-f59f-16b9-0000000000fd
8454 1726882405.67331: variable 'ansible_search_path' from source: unknown
8454 1726882405.67336: variable 'ansible_search_path' from source: unknown
8454 1726882405.67369: calling self._execute()
8454 1726882405.67437: variable 'ansible_host' from source: host vars for 'managed_node3'
8454 1726882405.67444: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
8454 1726882405.67454: variable 'omit' from source: magic vars
8454 1726882405.67758: variable 'ansible_distribution' from source: facts
8454 1726882405.67767: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): False
8454 1726882405.67771: when evaluation is False, skipping this task
8454 1726882405.67774: _execute() done
8454 1726882405.67779: dumping result to json
8454 1726882405.67786: done dumping result, returning
8454 1726882405.67793: done running TaskExecutor() for managed_node3/TASK: Create EPEL 39 [0affe814-3a2d-f59f-16b9-0000000000fd]
8454 1726882405.67798: sending task result for task 0affe814-3a2d-f59f-16b9-0000000000fd
8454 1726882405.67943: done sending task result for task 0affe814-3a2d-f59f-16b9-0000000000fd
8454 1726882405.67951: WORKER PROCESS EXITING
skipping: [managed_node3] => {
    "changed": false,
    "false_condition": "ansible_distribution in ['RedHat', 'CentOS']",
    "skip_reason": "Conditional result was False"
}
8454 1726882405.68006: no more pending results, returning what we have
8454 1726882405.68010: results queue empty
8454 1726882405.68011: checking for any_errors_fatal
8454 1726882405.68012: done checking for any_errors_fatal
8454 1726882405.68013: checking for max_fail_percentage
8454 1726882405.68016: done checking for max_fail_percentage
8454 1726882405.68017: checking to see if all hosts have failed and the running result is not ok
8454 1726882405.68018: done checking to see if all hosts have failed
8454 1726882405.68019: getting the remaining hosts for this loop
8454 1726882405.68020: done getting the remaining hosts for this loop
8454 1726882405.68024: getting the next task for host managed_node3
8454 1726882405.68031: done getting next task for host managed_node3
8454 1726882405.68035: ^ task is: TASK: Install yum-utils package
8454 1726882405.68044: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
8454 1726882405.68050: getting variables
8454 1726882405.68052: in VariableManager get_vars()
8454 1726882405.68089: Calling all_inventory to load vars for managed_node3
8454 1726882405.68096: Calling groups_inventory to load vars for managed_node3
8454 1726882405.68100: Calling all_plugins_inventory to load vars for managed_node3
8454 1726882405.68111: Calling all_plugins_play to load vars for managed_node3
8454 1726882405.68117: Calling groups_plugins_inventory to load vars for managed_node3
8454 1726882405.68122: Calling groups_plugins_play to load vars for managed_node3
8454 1726882405.68378: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
8454 1726882405.68558: done with get_vars()
8454 1726882405.68567: done getting variables
8454 1726882405.68640: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True)

TASK [Install yum-utils package] ***********************************************
task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:26
Friday 20 September 2024 21:33:25 -0400 (0:00:00.018) 0:00:03.704 ******
8454 1726882405.68664: entering _queue_task() for managed_node3/package
8454 1726882405.68665: Creating lock for package
8454 1726882405.68916: worker is 1 (out of 1 available)
8454 1726882405.68927: exiting _queue_task() for managed_node3/package
8454 1726882405.68944: done queuing things up, now waiting for results queue to drain
8454 1726882405.68946: waiting for pending results...
8454 1726882405.69464: running TaskExecutor() for managed_node3/TASK: Install yum-utils package
8454 1726882405.69469: in run() - task 0affe814-3a2d-f59f-16b9-0000000000fe
8454 1726882405.69473: variable 'ansible_search_path' from source: unknown
8454 1726882405.69475: variable 'ansible_search_path' from source: unknown
8454 1726882405.69478: calling self._execute()
8454 1726882405.69513: variable 'ansible_host' from source: host vars for 'managed_node3'
8454 1726882405.69519: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
8454 1726882405.69523: variable 'omit' from source: magic vars
8454 1726882405.69898: variable 'ansible_distribution' from source: facts
8454 1726882405.69902: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): False
8454 1726882405.69909: when evaluation is False, skipping this task
8454 1726882405.69915: _execute() done
8454 1726882405.69918: dumping result to json
8454 1726882405.69921: done dumping result, returning
8454 1726882405.69969: done running TaskExecutor() for managed_node3/TASK: Install yum-utils package [0affe814-3a2d-f59f-16b9-0000000000fe]
8454 1726882405.69975: sending task result for task 0affe814-3a2d-f59f-16b9-0000000000fe
8454 1726882405.70055: done sending task result for task 0affe814-3a2d-f59f-16b9-0000000000fe
8454 1726882405.70062: WORKER PROCESS EXITING
skipping: [managed_node3] => {
    "changed": false,
    "false_condition": "ansible_distribution in ['RedHat', 'CentOS']",
    "skip_reason": "Conditional result was False"
}
8454 1726882405.70124: no more pending results, returning what we have
8454 1726882405.70126: results queue empty
8454 1726882405.70127: checking for any_errors_fatal
8454 1726882405.70131: done checking for any_errors_fatal
8454 1726882405.70132: checking for max_fail_percentage
8454 1726882405.70133: done checking for max_fail_percentage
8454 1726882405.70136: checking to see if all hosts have failed and the running result is not ok
8454 1726882405.70136: done checking to see if all hosts have failed
8454 1726882405.70137: getting the remaining hosts for this loop
8454 1726882405.70138: done getting the remaining hosts for this loop
8454 1726882405.70141: getting the next task for host managed_node3
8454 1726882405.70145: done getting next task for host managed_node3
8454 1726882405.70146: ^ task is: TASK: Enable EPEL 7
8454 1726882405.70149: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
8454 1726882405.70152: getting variables
8454 1726882405.70152: in VariableManager get_vars()
8454 1726882405.70176: Calling all_inventory to load vars for managed_node3
8454 1726882405.70178: Calling groups_inventory to load vars for managed_node3
8454 1726882405.70183: Calling all_plugins_inventory to load vars for managed_node3
8454 1726882405.70190: Calling all_plugins_play to load vars for managed_node3
8454 1726882405.70192: Calling groups_plugins_inventory to load vars for managed_node3
8454 1726882405.70194: Calling groups_plugins_play to load vars for managed_node3
8454 1726882405.70328: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
8454 1726882405.70572: done with get_vars()
8454 1726882405.70584: done getting variables
8454 1726882405.70643: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)

TASK [Enable EPEL 7] ***********************************************************
task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:32
Friday 20 September 2024 21:33:25 -0400 (0:00:00.020) 0:00:03.724 ******
8454 1726882405.70677: entering _queue_task() for managed_node3/command
8454 1726882405.70952: worker is 1 (out of 1 available)
8454 1726882405.70982: exiting _queue_task() for managed_node3/command
8454 1726882405.71007: done queuing things up, now waiting for results queue to drain
8454 1726882405.71009: waiting for pending results...
8454 1726882405.71452: running TaskExecutor() for managed_node3/TASK: Enable EPEL 7 8454 1726882405.71456: in run() - task 0affe814-3a2d-f59f-16b9-0000000000ff 8454 1726882405.71459: variable 'ansible_search_path' from source: unknown 8454 1726882405.71461: variable 'ansible_search_path' from source: unknown 8454 1726882405.71468: calling self._execute() 8454 1726882405.71554: variable 'ansible_host' from source: host vars for 'managed_node3' 8454 1726882405.71568: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8454 1726882405.71590: variable 'omit' from source: magic vars 8454 1726882405.72019: variable 'ansible_distribution' from source: facts 8454 1726882405.72039: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): False 8454 1726882405.72048: when evaluation is False, skipping this task 8454 1726882405.72055: _execute() done 8454 1726882405.72062: dumping result to json 8454 1726882405.72070: done dumping result, returning 8454 1726882405.72083: done running TaskExecutor() for managed_node3/TASK: Enable EPEL 7 [0affe814-3a2d-f59f-16b9-0000000000ff] 8454 1726882405.72096: sending task result for task 0affe814-3a2d-f59f-16b9-0000000000ff skipping: [managed_node3] => { "changed": false, "false_condition": "ansible_distribution in ['RedHat', 'CentOS']", "skip_reason": "Conditional result was False" } 8454 1726882405.72282: no more pending results, returning what we have 8454 1726882405.72286: results queue empty 8454 1726882405.72287: checking for any_errors_fatal 8454 1726882405.72293: done checking for any_errors_fatal 8454 1726882405.72295: checking for max_fail_percentage 8454 1726882405.72296: done checking for max_fail_percentage 8454 1726882405.72298: checking to see if all hosts have failed and the running result is not ok 8454 1726882405.72299: done checking to see if all hosts have failed 8454 1726882405.72300: getting the remaining hosts for this loop 8454 1726882405.72302: done getting the remaining hosts 
for this loop 8454 1726882405.72307: getting the next task for host managed_node3 8454 1726882405.72314: done getting next task for host managed_node3 8454 1726882405.72317: ^ task is: TASK: Enable EPEL 8 8454 1726882405.72322: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8454 1726882405.72327: getting variables 8454 1726882405.72329: in VariableManager get_vars() 8454 1726882405.72363: Calling all_inventory to load vars for managed_node3 8454 1726882405.72367: Calling groups_inventory to load vars for managed_node3 8454 1726882405.72371: Calling all_plugins_inventory to load vars for managed_node3 8454 1726882405.72388: Calling all_plugins_play to load vars for managed_node3 8454 1726882405.72393: Calling groups_plugins_inventory to load vars for managed_node3 8454 1726882405.72398: Calling groups_plugins_play to load vars for managed_node3 8454 1726882405.72792: done sending task result for task 0affe814-3a2d-f59f-16b9-0000000000ff 8454 1726882405.72796: WORKER PROCESS EXITING 8454 1726882405.72822: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8454 1726882405.73097: done with get_vars() 8454 1726882405.73108: done getting variables 8454 1726882405.73168: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Enable EPEL 8] *********************************************************** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:37 Friday 20 September 2024 21:33:25 -0400 (0:00:00.025) 0:00:03.749 ****** 8454 1726882405.73203: entering _queue_task() for managed_node3/command 8454 1726882405.73388: worker is 1 (out of 1 available) 8454 1726882405.73402: exiting _queue_task() for managed_node3/command 8454 1726882405.73412: done queuing things up, now waiting for results queue to drain 8454 1726882405.73414: waiting for pending results... 
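(Annotation: the log above records the "Enable EPEL 7" task being skipped and "Enable EPEL 8" being queued, both gated on the same `when` expression that the engine reports as `false_condition`. The actual `enable_epel.yml` task file is not reproduced in this log, so the following is only a hypothetical sketch of the kind of conditional `command` task that would produce exactly this skip result; the task body shown in the comment is an assumption, not taken from the log.)

```yaml
# Hypothetical sketch -- the real tasks/enable_epel.yml is not part of this log.
# Only the task names, the use of the `command` action plugin, and the `when`
# expression ("ansible_distribution in ['RedHat', 'CentOS']") appear in the log.
- name: Enable EPEL 8
  # Assumed task body for illustration; the log does not show the command run.
  command: dnf config-manager --set-enabled epel
  when: ansible_distribution in ['RedHat', 'CentOS']
```

When the `when` expression evaluates to False (here the managed node is not RedHat/CentOS, per the `ansible_distribution` fact), `_execute()` short-circuits before any action plugin runs, which is why the log shows `skipping: [managed_node3]` with `"changed": false` and no connection activity for these tasks.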
8454 1726882405.73560: running TaskExecutor() for managed_node3/TASK: Enable EPEL 8 8454 1726882405.73632: in run() - task 0affe814-3a2d-f59f-16b9-000000000100 8454 1726882405.73649: variable 'ansible_search_path' from source: unknown 8454 1726882405.73652: variable 'ansible_search_path' from source: unknown 8454 1726882405.73684: calling self._execute() 8454 1726882405.73746: variable 'ansible_host' from source: host vars for 'managed_node3' 8454 1726882405.73751: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8454 1726882405.73764: variable 'omit' from source: magic vars 8454 1726882405.74056: variable 'ansible_distribution' from source: facts 8454 1726882405.74066: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): False 8454 1726882405.74070: when evaluation is False, skipping this task 8454 1726882405.74073: _execute() done 8454 1726882405.74078: dumping result to json 8454 1726882405.74084: done dumping result, returning 8454 1726882405.74097: done running TaskExecutor() for managed_node3/TASK: Enable EPEL 8 [0affe814-3a2d-f59f-16b9-000000000100] 8454 1726882405.74100: sending task result for task 0affe814-3a2d-f59f-16b9-000000000100 8454 1726882405.74186: done sending task result for task 0affe814-3a2d-f59f-16b9-000000000100 8454 1726882405.74189: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "ansible_distribution in ['RedHat', 'CentOS']", "skip_reason": "Conditional result was False" } 8454 1726882405.74249: no more pending results, returning what we have 8454 1726882405.74253: results queue empty 8454 1726882405.74253: checking for any_errors_fatal 8454 1726882405.74257: done checking for any_errors_fatal 8454 1726882405.74258: checking for max_fail_percentage 8454 1726882405.74260: done checking for max_fail_percentage 8454 1726882405.74261: checking to see if all hosts have failed and the running result is not ok 8454 1726882405.74262: done checking to see if 
all hosts have failed 8454 1726882405.74263: getting the remaining hosts for this loop 8454 1726882405.74264: done getting the remaining hosts for this loop 8454 1726882405.74268: getting the next task for host managed_node3 8454 1726882405.74276: done getting next task for host managed_node3 8454 1726882405.74281: ^ task is: TASK: Enable EPEL 6 8454 1726882405.74285: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8454 1726882405.74288: getting variables 8454 1726882405.74288: in VariableManager get_vars() 8454 1726882405.74308: Calling all_inventory to load vars for managed_node3 8454 1726882405.74310: Calling groups_inventory to load vars for managed_node3 8454 1726882405.74312: Calling all_plugins_inventory to load vars for managed_node3 8454 1726882405.74319: Calling all_plugins_play to load vars for managed_node3 8454 1726882405.74321: Calling groups_plugins_inventory to load vars for managed_node3 8454 1726882405.74323: Calling groups_plugins_play to load vars for managed_node3 8454 1726882405.74460: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8454 1726882405.74615: done with get_vars() 8454 1726882405.74622: done getting variables 8454 1726882405.74665: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Enable EPEL 6] *********************************************************** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:42 Friday 20 September 2024 21:33:25 -0400 (0:00:00.014) 0:00:03.764 ****** 8454 1726882405.74688: entering _queue_task() for managed_node3/copy 8454 1726882405.74850: worker is 1 (out of 1 available) 8454 1726882405.74861: exiting _queue_task() for managed_node3/copy 8454 1726882405.74871: done queuing things up, now waiting for results queue to drain 8454 1726882405.74873: waiting for pending results... 
8454 1726882405.75027: running TaskExecutor() for managed_node3/TASK: Enable EPEL 6 8454 1726882405.75101: in run() - task 0affe814-3a2d-f59f-16b9-000000000102 8454 1726882405.75112: variable 'ansible_search_path' from source: unknown 8454 1726882405.75116: variable 'ansible_search_path' from source: unknown 8454 1726882405.75245: calling self._execute() 8454 1726882405.75249: variable 'ansible_host' from source: host vars for 'managed_node3' 8454 1726882405.75252: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8454 1726882405.75270: variable 'omit' from source: magic vars 8454 1726882405.75675: variable 'ansible_distribution' from source: facts 8454 1726882405.75679: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): False 8454 1726882405.75682: when evaluation is False, skipping this task 8454 1726882405.75687: _execute() done 8454 1726882405.75693: dumping result to json 8454 1726882405.75698: done dumping result, returning 8454 1726882405.75778: done running TaskExecutor() for managed_node3/TASK: Enable EPEL 6 [0affe814-3a2d-f59f-16b9-000000000102] 8454 1726882405.75783: sending task result for task 0affe814-3a2d-f59f-16b9-000000000102 8454 1726882405.75853: done sending task result for task 0affe814-3a2d-f59f-16b9-000000000102 8454 1726882405.75857: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "ansible_distribution in ['RedHat', 'CentOS']", "skip_reason": "Conditional result was False" } 8454 1726882405.75922: no more pending results, returning what we have 8454 1726882405.75925: results queue empty 8454 1726882405.75926: checking for any_errors_fatal 8454 1726882405.75930: done checking for any_errors_fatal 8454 1726882405.75931: checking for max_fail_percentage 8454 1726882405.75933: done checking for max_fail_percentage 8454 1726882405.75937: checking to see if all hosts have failed and the running result is not ok 8454 1726882405.75938: done checking to see if 
all hosts have failed 8454 1726882405.75939: getting the remaining hosts for this loop 8454 1726882405.75940: done getting the remaining hosts for this loop 8454 1726882405.75944: getting the next task for host managed_node3 8454 1726882405.75952: done getting next task for host managed_node3 8454 1726882405.75955: ^ task is: TASK: Set network provider to 'nm' 8454 1726882405.75957: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8454 1726882405.75961: getting variables 8454 1726882405.75962: in VariableManager get_vars() 8454 1726882405.75999: Calling all_inventory to load vars for managed_node3 8454 1726882405.76002: Calling groups_inventory to load vars for managed_node3 8454 1726882405.76006: Calling all_plugins_inventory to load vars for managed_node3 8454 1726882405.76015: Calling all_plugins_play to load vars for managed_node3 8454 1726882405.76019: Calling groups_plugins_inventory to load vars for managed_node3 8454 1726882405.76023: Calling groups_plugins_play to load vars for managed_node3 8454 1726882405.76308: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8454 1726882405.76607: done with get_vars() 8454 1726882405.76640: done getting variables 8454 1726882405.76706: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Set network provider to 'nm'] ******************************************** task path: 
/tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/tests_bond_nm.yml:13 Friday 20 September 2024 21:33:25 -0400 (0:00:00.020) 0:00:03.785 ****** 8454 1726882405.76754: entering _queue_task() for managed_node3/set_fact 8454 1726882405.77353: worker is 1 (out of 1 available) 8454 1726882405.77363: exiting _queue_task() for managed_node3/set_fact 8454 1726882405.77376: done queuing things up, now waiting for results queue to drain 8454 1726882405.77378: waiting for pending results... 8454 1726882405.77726: running TaskExecutor() for managed_node3/TASK: Set network provider to 'nm' 8454 1726882405.77739: in run() - task 0affe814-3a2d-f59f-16b9-000000000007 8454 1726882405.77746: variable 'ansible_search_path' from source: unknown 8454 1726882405.77749: calling self._execute() 8454 1726882405.77786: variable 'ansible_host' from source: host vars for 'managed_node3' 8454 1726882405.77794: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8454 1726882405.77806: variable 'omit' from source: magic vars 8454 1726882405.77929: variable 'omit' from source: magic vars 8454 1726882405.77966: variable 'omit' from source: magic vars 8454 1726882405.78040: variable 'omit' from source: magic vars 8454 1726882405.78055: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 8454 1726882405.78099: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 8454 1726882405.78120: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 8454 1726882405.78151: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8454 1726882405.78155: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8454 1726882405.78260: variable 'inventory_hostname' from source: 
host vars for 'managed_node3' 8454 1726882405.78263: variable 'ansible_host' from source: host vars for 'managed_node3' 8454 1726882405.78267: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8454 1726882405.78321: Set connection var ansible_connection to ssh 8454 1726882405.78332: Set connection var ansible_shell_executable to /bin/sh 8454 1726882405.78341: Set connection var ansible_timeout to 10 8454 1726882405.78344: Set connection var ansible_shell_type to sh 8454 1726882405.78356: Set connection var ansible_pipelining to False 8454 1726882405.78364: Set connection var ansible_module_compression to ZIP_DEFLATED 8454 1726882405.78440: variable 'ansible_shell_executable' from source: unknown 8454 1726882405.78443: variable 'ansible_connection' from source: unknown 8454 1726882405.78446: variable 'ansible_module_compression' from source: unknown 8454 1726882405.78449: variable 'ansible_shell_type' from source: unknown 8454 1726882405.78451: variable 'ansible_shell_executable' from source: unknown 8454 1726882405.78454: variable 'ansible_host' from source: host vars for 'managed_node3' 8454 1726882405.78456: variable 'ansible_pipelining' from source: unknown 8454 1726882405.78458: variable 'ansible_timeout' from source: unknown 8454 1726882405.78460: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8454 1726882405.78580: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 8454 1726882405.78593: variable 'omit' from source: magic vars 8454 1726882405.78616: starting attempt loop 8454 1726882405.78619: running the handler 8454 1726882405.78622: handler run complete 8454 1726882405.78625: attempt loop complete, returning result 8454 1726882405.78725: _execute() 
done 8454 1726882405.78729: dumping result to json 8454 1726882405.78731: done dumping result, returning 8454 1726882405.78733: done running TaskExecutor() for managed_node3/TASK: Set network provider to 'nm' [0affe814-3a2d-f59f-16b9-000000000007] 8454 1726882405.78737: sending task result for task 0affe814-3a2d-f59f-16b9-000000000007 8454 1726882405.78799: done sending task result for task 0affe814-3a2d-f59f-16b9-000000000007 8454 1726882405.78802: WORKER PROCESS EXITING ok: [managed_node3] => { "ansible_facts": { "network_provider": "nm" }, "changed": false } 8454 1726882405.78856: no more pending results, returning what we have 8454 1726882405.78859: results queue empty 8454 1726882405.78860: checking for any_errors_fatal 8454 1726882405.78865: done checking for any_errors_fatal 8454 1726882405.78866: checking for max_fail_percentage 8454 1726882405.78867: done checking for max_fail_percentage 8454 1726882405.78868: checking to see if all hosts have failed and the running result is not ok 8454 1726882405.78869: done checking to see if all hosts have failed 8454 1726882405.78870: getting the remaining hosts for this loop 8454 1726882405.78872: done getting the remaining hosts for this loop 8454 1726882405.78875: getting the next task for host managed_node3 8454 1726882405.78881: done getting next task for host managed_node3 8454 1726882405.78883: ^ task is: TASK: meta (flush_handlers) 8454 1726882405.78885: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8454 1726882405.78891: getting variables 8454 1726882405.78892: in VariableManager get_vars() 8454 1726882405.78917: Calling all_inventory to load vars for managed_node3 8454 1726882405.78920: Calling groups_inventory to load vars for managed_node3 8454 1726882405.78924: Calling all_plugins_inventory to load vars for managed_node3 8454 1726882405.78941: Calling all_plugins_play to load vars for managed_node3 8454 1726882405.78946: Calling groups_plugins_inventory to load vars for managed_node3 8454 1726882405.78950: Calling groups_plugins_play to load vars for managed_node3 8454 1726882405.79178: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8454 1726882405.79465: done with get_vars() 8454 1726882405.79476: done getting variables 8454 1726882405.79554: in VariableManager get_vars() 8454 1726882405.79565: Calling all_inventory to load vars for managed_node3 8454 1726882405.79567: Calling groups_inventory to load vars for managed_node3 8454 1726882405.79571: Calling all_plugins_inventory to load vars for managed_node3 8454 1726882405.79575: Calling all_plugins_play to load vars for managed_node3 8454 1726882405.79578: Calling groups_plugins_inventory to load vars for managed_node3 8454 1726882405.79582: Calling groups_plugins_play to load vars for managed_node3 8454 1726882405.79999: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8454 1726882405.80259: done with get_vars() 8454 1726882405.80274: done queuing things up, now waiting for results queue to drain 8454 1726882405.80276: results queue empty 8454 1726882405.80277: checking for any_errors_fatal 8454 1726882405.80279: done checking for any_errors_fatal 8454 1726882405.80280: checking for max_fail_percentage 8454 1726882405.80281: done checking for max_fail_percentage 8454 1726882405.80282: checking to see if all hosts have failed and the running result is not ok 8454 1726882405.80283: 
done checking to see if all hosts have failed 8454 1726882405.80284: getting the remaining hosts for this loop 8454 1726882405.80285: done getting the remaining hosts for this loop 8454 1726882405.80289: getting the next task for host managed_node3 8454 1726882405.80293: done getting next task for host managed_node3 8454 1726882405.80295: ^ task is: TASK: meta (flush_handlers) 8454 1726882405.80296: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8454 1726882405.80303: getting variables 8454 1726882405.80305: in VariableManager get_vars() 8454 1726882405.80313: Calling all_inventory to load vars for managed_node3 8454 1726882405.80316: Calling groups_inventory to load vars for managed_node3 8454 1726882405.80319: Calling all_plugins_inventory to load vars for managed_node3 8454 1726882405.80324: Calling all_plugins_play to load vars for managed_node3 8454 1726882405.80327: Calling groups_plugins_inventory to load vars for managed_node3 8454 1726882405.80330: Calling groups_plugins_play to load vars for managed_node3 8454 1726882405.80526: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8454 1726882405.80890: done with get_vars() 8454 1726882405.80987: done getting variables 8454 1726882405.81051: in VariableManager get_vars() 8454 1726882405.81062: Calling all_inventory to load vars for managed_node3 8454 1726882405.81064: Calling groups_inventory to load vars for managed_node3 8454 1726882405.81068: Calling all_plugins_inventory to load vars for managed_node3 8454 1726882405.81073: Calling all_plugins_play to load vars for managed_node3 8454 1726882405.81076: Calling groups_plugins_inventory to load vars for managed_node3 8454 1726882405.81080: Calling 
groups_plugins_play to load vars for managed_node3 8454 1726882405.81278: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8454 1726882405.81581: done with get_vars() 8454 1726882405.81594: done queuing things up, now waiting for results queue to drain 8454 1726882405.81596: results queue empty 8454 1726882405.81597: checking for any_errors_fatal 8454 1726882405.81599: done checking for any_errors_fatal 8454 1726882405.81600: checking for max_fail_percentage 8454 1726882405.81601: done checking for max_fail_percentage 8454 1726882405.81602: checking to see if all hosts have failed and the running result is not ok 8454 1726882405.81603: done checking to see if all hosts have failed 8454 1726882405.81604: getting the remaining hosts for this loop 8454 1726882405.81605: done getting the remaining hosts for this loop 8454 1726882405.81607: getting the next task for host managed_node3 8454 1726882405.81611: done getting next task for host managed_node3 8454 1726882405.81612: ^ task is: None 8454 1726882405.81613: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8454 1726882405.81615: done queuing things up, now waiting for results queue to drain 8454 1726882405.81616: results queue empty 8454 1726882405.81617: checking for any_errors_fatal 8454 1726882405.81618: done checking for any_errors_fatal 8454 1726882405.81619: checking for max_fail_percentage 8454 1726882405.81620: done checking for max_fail_percentage 8454 1726882405.81621: checking to see if all hosts have failed and the running result is not ok 8454 1726882405.81622: done checking to see if all hosts have failed 8454 1726882405.81623: getting the next task for host managed_node3 8454 1726882405.81626: done getting next task for host managed_node3 8454 1726882405.81627: ^ task is: None 8454 1726882405.81629: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8454 1726882405.81676: in VariableManager get_vars() 8454 1726882405.81703: done with get_vars() 8454 1726882405.81711: in VariableManager get_vars() 8454 1726882405.81730: done with get_vars() 8454 1726882405.81737: variable 'omit' from source: magic vars 8454 1726882405.81786: in VariableManager get_vars() 8454 1726882405.81807: done with get_vars() 8454 1726882405.81831: variable 'omit' from source: magic vars PLAY [Play for testing bond connection] **************************************** 8454 1726882405.82814: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py (found_in_cache=True, class_only=False) 8454 1726882405.82842: getting the remaining hosts for this loop 8454 1726882405.82843: done getting the remaining hosts for this loop 8454 1726882405.82846: getting the next task for host managed_node3 8454 1726882405.82850: done getting next task for host managed_node3 8454 1726882405.82852: ^ task is: TASK: Gathering Facts 8454 1726882405.82854: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8454 1726882405.82856: getting variables 8454 1726882405.82857: in VariableManager get_vars() 8454 1726882405.82879: Calling all_inventory to load vars for managed_node3 8454 1726882405.82882: Calling groups_inventory to load vars for managed_node3 8454 1726882405.82885: Calling all_plugins_inventory to load vars for managed_node3 8454 1726882405.82891: Calling all_plugins_play to load vars for managed_node3 8454 1726882405.82907: Calling groups_plugins_inventory to load vars for managed_node3 8454 1726882405.82911: Calling groups_plugins_play to load vars for managed_node3 8454 1726882405.83112: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8454 1726882405.83400: done with get_vars() 8454 1726882405.83417: done getting variables 8454 1726882405.83464: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Gathering Facts] ********************************************************* task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond.yml:3 Friday 20 September 2024 21:33:25 -0400 (0:00:00.067) 0:00:03.852 ****** 8454 1726882405.83489: entering _queue_task() for managed_node3/gather_facts 8454 1726882405.83748: worker is 1 (out of 1 available) 8454 1726882405.83760: exiting _queue_task() for managed_node3/gather_facts 8454 1726882405.83776: done queuing things up, now waiting for results queue to drain 8454 1726882405.83778: waiting for pending results... 
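(Annotation: the log entries that follow show the implicit "Gathering Facts" task for the play `Play for testing bond connection` from `tests_bond.yml:3`. The play source itself is not included in this log, so the sketch below is a hypothetical reconstruction of the play opening; the `hosts:` pattern is an assumption.)

```yaml
# Hypothetical sketch of the play header driving the fact-gathering below;
# the real playbooks/tests_bond.yml is not reproduced in this log.
- name: Play for testing bond connection
  hosts: all          # assumed; the actual host pattern is not shown in the log
  # gather_facts defaults to true, which is why an implicit
  # "TASK [Gathering Facts]" runs before any task in the play.
  tasks: []
```

Because `gather_facts` is not disabled, the strategy inserts the setup task first: the `gather_facts` action plugin opens the SSH connection (multiplexed over an existing ControlMaster, per the `auto-mux: Trying existing master` lines), discovers the remote home directory with `echo ~`, and creates a per-task temp directory under `/root/.ansible/tmp/` before shipping the module payload.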
8454 1726882405.84116: running TaskExecutor() for managed_node3/TASK: Gathering Facts 8454 1726882405.84230: in run() - task 0affe814-3a2d-f59f-16b9-000000000128 8454 1726882405.84303: variable 'ansible_search_path' from source: unknown 8454 1726882405.84319: calling self._execute() 8454 1726882405.84429: variable 'ansible_host' from source: host vars for 'managed_node3' 8454 1726882405.84445: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8454 1726882405.84463: variable 'omit' from source: magic vars 8454 1726882405.84933: variable 'ansible_distribution_major_version' from source: facts 8454 1726882405.84965: Evaluated conditional (ansible_distribution_major_version != '6'): True 8454 1726882405.85040: variable 'omit' from source: magic vars 8454 1726882405.85043: variable 'omit' from source: magic vars 8454 1726882405.85072: variable 'omit' from source: magic vars 8454 1726882405.85121: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 8454 1726882405.85182: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 8454 1726882405.85211: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 8454 1726882405.85245: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8454 1726882405.85274: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8454 1726882405.85321: variable 'inventory_hostname' from source: host vars for 'managed_node3' 8454 1726882405.85331: variable 'ansible_host' from source: host vars for 'managed_node3' 8454 1726882405.85343: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8454 1726882405.85494: Set connection var ansible_connection to ssh 8454 1726882405.85542: Set connection var 
ansible_shell_executable to /bin/sh 8454 1726882405.85545: Set connection var ansible_timeout to 10 8454 1726882405.85548: Set connection var ansible_shell_type to sh 8454 1726882405.85550: Set connection var ansible_pipelining to False 8454 1726882405.85558: Set connection var ansible_module_compression to ZIP_DEFLATED 8454 1726882405.85603: variable 'ansible_shell_executable' from source: unknown 8454 1726882405.85606: variable 'ansible_connection' from source: unknown 8454 1726882405.85610: variable 'ansible_module_compression' from source: unknown 8454 1726882405.85666: variable 'ansible_shell_type' from source: unknown 8454 1726882405.85669: variable 'ansible_shell_executable' from source: unknown 8454 1726882405.85672: variable 'ansible_host' from source: host vars for 'managed_node3' 8454 1726882405.85674: variable 'ansible_pipelining' from source: unknown 8454 1726882405.85676: variable 'ansible_timeout' from source: unknown 8454 1726882405.85678: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8454 1726882405.85897: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 8454 1726882405.85918: variable 'omit' from source: magic vars 8454 1726882405.85941: starting attempt loop 8454 1726882405.85950: running the handler 8454 1726882405.85991: variable 'ansible_facts' from source: unknown 8454 1726882405.86003: _low_level_execute_command(): starting 8454 1726882405.86017: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 8454 1726882405.86863: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8454 1726882405.86960: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 8454 1726882405.86968: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8454 1726882405.87119: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 8454 1726882405.89562: stdout chunk (state=3): >>>/root <<< 8454 1726882405.89721: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8454 1726882405.89753: stderr chunk (state=3): >>><<< 8454 1726882405.89757: stdout chunk (state=3): >>><<< 8454 1726882405.89780: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 8454 1726882405.89794: _low_level_execute_command(): starting 8454 1726882405.89800: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882405.8978212-8624-191204422007956 `" && echo ansible-tmp-1726882405.8978212-8624-191204422007956="` echo /root/.ansible/tmp/ansible-tmp-1726882405.8978212-8624-191204422007956 `" ) && sleep 0' 8454 1726882405.90216: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 8454 1726882405.90259: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found <<< 8454 1726882405.90264: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8454 1726882405.90267: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 8454 1726882405.90277: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8454 1726882405.90314: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 8454 1726882405.90321: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 8454 1726882405.90443: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 8454 1726882405.93399: stdout chunk (state=3): >>>ansible-tmp-1726882405.8978212-8624-191204422007956=/root/.ansible/tmp/ansible-tmp-1726882405.8978212-8624-191204422007956 <<< 8454 1726882405.93577: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8454 1726882405.93632: stderr chunk (state=3): >>><<< 8454 1726882405.93636: stdout chunk (state=3): >>><<< 8454 1726882405.93656: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882405.8978212-8624-191204422007956=/root/.ansible/tmp/ansible-tmp-1726882405.8978212-8624-191204422007956 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 8454 1726882405.93700: variable 'ansible_module_compression' from source: unknown 8454 1726882405.93841: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-8454ynh6eu7y/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 8454 1726882405.93844: variable 'ansible_facts' from source: unknown 8454 1726882405.94007: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882405.8978212-8624-191204422007956/AnsiballZ_setup.py 8454 1726882405.94257: Sending initial data 8454 1726882405.94345: Sent initial data (152 bytes) 8454 1726882405.94951: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8454 1726882405.94989: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 8454 1726882405.95004: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 8454 
1726882405.95030: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8454 1726882405.95197: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 8454 1726882405.97585: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 8454 1726882405.97711: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 8454 1726882405.97838: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-8454ynh6eu7y/tmpe5swdgnw /root/.ansible/tmp/ansible-tmp-1726882405.8978212-8624-191204422007956/AnsiballZ_setup.py <<< 8454 1726882405.97842: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882405.8978212-8624-191204422007956/AnsiballZ_setup.py" <<< 8454 1726882405.97984: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-8454ynh6eu7y/tmpe5swdgnw" to remote "/root/.ansible/tmp/ansible-tmp-1726882405.8978212-8624-191204422007956/AnsiballZ_setup.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882405.8978212-8624-191204422007956/AnsiballZ_setup.py" <<< 8454 1726882406.00666: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8454 1726882406.00675: stderr chunk (state=3): >>><<< 8454 1726882406.00688: stdout chunk (state=3): >>><<< 8454 1726882406.00707: done transferring module to remote 8454 1726882406.00718: _low_level_execute_command(): starting 8454 1726882406.00723: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882405.8978212-8624-191204422007956/ /root/.ansible/tmp/ansible-tmp-1726882405.8978212-8624-191204422007956/AnsiballZ_setup.py && sleep 0' 8454 1726882406.01144: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 8454 1726882406.01151: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 
10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 8454 1726882406.01164: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8454 1726882406.01213: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 8454 1726882406.01216: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 8454 1726882406.01337: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 8454 1726882406.04114: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8454 1726882406.04162: stderr chunk (state=3): >>><<< 8454 1726882406.04167: stdout chunk (state=3): >>><<< 8454 1726882406.04191: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 8454 1726882406.04194: _low_level_execute_command(): starting 8454 1726882406.04199: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882405.8978212-8624-191204422007956/AnsiballZ_setup.py && sleep 0' 8454 1726882406.04651: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 8454 1726882406.04654: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8454 1726882406.04657: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 8454 1726882406.04659: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8454 1726882406.04722: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 8454 1726882406.04726: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8454 1726882406.04884: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 
<<< 8454 1726882407.05229: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_system": "Linux", "ansible_kernel": "6.10.9-100.fc39.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Mon Sep 9 02:28:01 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-41-238.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-41-238", "ansible_nodename": "ip-10-31-41-238.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec243b3f63949f99a85dd461938b27f6", "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_loadavg": {"1m": 0.5126953125, "5m": 0.3125, "15m": 0.16357421875}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_fibre_channel_wwn": [], "ansible_ssh_host_key_dsa_public": 
"AAAAB3NzaC1kc3MAAACBANYijgij1fEhTOf5yay/qzv1+ckF/sTeAcrQU7mSl4JlHlSjJFRS9ZOEcTyZhIM24bmrUmUAXGByisr1fJhHdM1w6H4TV9d8eAGz5dvqRt3OMFXU98TIudAuK2zln4nrfCSz2a6X/3opBJuckX9rZaO0ickijdGATG1zU5j6yse5AAAAFQCB9h5S0fKeFdZzGlNOZp7suEtGLQAAAIBqL8YusJlS5M+t8hqB5XoiVX2JRwxeO45o1F+YDEL8s88gEWv3QxNNB5xqhdMrEbA13n8FJWfZZdvcU7PONunHJRbKJZFHcCdK5TI9eGObNVaZTYNSFhZ2BieAeUf4m7eiHxQI/o71WHee8ehKt8oSXovKXzKzzh6V8adityCM0QAAAIBN41A6QjMEnqm1991CkEko30YVBdWgdcunoDD7NJJoONNTR054WsZIbydxWyYVFa3fC4HcmSYJjXHxuSZuCmzFZYOzQSedVWWiET/kLEvIDxOZEQ44DCsa3zg29Ty97IbNNwLSIOFXoUbWllCQV9qge0q5dQ/J1wTQdymso3DyLQ==", "ansible_ssh_host_key_dsa_public_keytype": "ssh-dss", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQC9Kn//qIm4a6eEMMZR4qpdJnSJr3FLIr3UgGTmVZPamYNdQ29vbdZONWFxDxoVqR69dvgMi7B4CVdaUu6QyBPlMI1hnnTMD7yGuFvf0wDLvk2p5tQf1MwOc8WdJCPqkvcCYTOD4gBf/qOT2LG96u3e6y9NpDSs43WwFzV2YMOpEVnbg+17SPjuOyE07jTJi4gLXbcjXxt9rz8nQMlsQPFysakPATk6pjVZnnTWDcFUSfc1sUdO6IWl/O4jlB/QtP/FkO38YQbSYx1fiZNsk+JP6ZeZ4F0trwlxRemd9P6eEqtA9jVdvSCvJNHgZoMob64uw1c2P8BFaAByky5crE35ggw6pKcQTHTAHhrPBTx12gpwlL4rB+OoysKYhxI8VeW+TYiNWBxF+EUpmcj/QMfOOgNbIEeK+YfNZ606vwhkyjORVqaN3MswYozhtwmAoyxDKaTAYWXo4+d+GqZ7pURKpwdZrI8M7e8Nvd+dwpW3OtrfAqXvFwIrBrivFfWnDE0=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBBsLvf+TKoJqIfm5H0y7RP9w7PN99SnDATkd0bkTPwuIbQqBA6MAihYQaVCQtnKQCWC09GNZyMQSeayjLONajkY=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIME0e8Dw9KPaZbGCYYNAh3+j3dHxYuGhpELosAmEvhOR", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_distribution": "Fedora", "ansible_distribution_release": "", "ansible_distribution_version": "39", "ansible_distribution_major_version": "39", "ansible_distribution_file_path": "/etc/redhat-release", 
"ansible_distribution_file_variety": "RedHat", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.9-100.fc39.x86_64", "root": "UUID=f92a5a40-e33d-4a6f-8746-997eff27cfbd", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.9-100.fc39.x86_64", "root": "UUID=f92a5a40-e33d-4a6f-8746-997eff27cfbd", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": ["tty0", "ttyS0,115200n8"]}, "ansible_is_chroot": false, "ansible_hostnqn": "", "ansible_dns": {"nameservers": ["127.0.0.53"], "options": {"e<<< 8454 1726882407.05311: stdout chunk (state=3): >>>dns0": true, "trust-ad": true}, "search": ["us-east-1.aws.redhat.com"]}, "ansible_local": {}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3717, "ansible_memfree_mb": 2881, "ansible_swaptotal_mb": 3716, "ansible_swapfree_mb": 3716, "ansible_memory_mb": {"real": {"total": 3717, "used": 836, "free": 2881}, "nocache": {"free": 3456, "used": 261}, "swap": {"total": 3716, "free": 3716, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec243b3f-6394-9f99-a85d-d461938b27f6", 
"ansible_product_uuid": "ec243b3f-6394-9f99-a85d-d461938b27f6", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["f92a5a40-e33d-4a6f-8746-997eff27cfbd"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "f92a5a40-e33d-4a6f-8746-997eff27cfbd", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}, "zram0": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "4096", "partitions": {}, "rotational": "0", "scheduler_mode": "", "sectors": "7612416", "sectorsize": "4096", "size": "3.63 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["f92a5a40-e33d-4a6f-8746-997eff27cfbd"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 550, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "ext4", "options": "rw,seclabel,relatime", "dump": 0, "passno": 0, "size_total": 264124022784, "size_available": 251223289856, "block_size": 4096, "block_total": 64483404, "block_available": 61333811, "block_used": 3149593, "inode_total": 16384000, "inode_available": 16303859, "inode_used": 80141, "uuid": "f92a5a40-e33d-4a6f-8746-997eff27cfbd"}], "ansible_python": {"version": {"major": 3, 
"minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_apparmor": {"status": "disabled"}, "ansible_lsb": {}, "ansible_iscsi_iqn": "", "ansible_env": {"SHELL": "/bin/bash", "EDITOR": "/usr/bin/nano", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.14.145 37172 10.31.41.238 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.14.145 37172 22", "DEBUGINFOD_URLS": "https://debuginfod.fedoraproject.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/1"}, "ansible_fips": false, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "21", "minute": "33", "second": "27", "epoch": "1726882407", "epoch_int": "1726882407", "date": "2024-09-20", "time": "21:33:27", "iso8601_micro": "2024-09-21T01:33:27.008232Z", "iso8601": "2024-09-21T01:33:27Z", "iso8601_basic": "20240920T213327008232", "iso8601_basic_short": "20240920T213327", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_interfaces": ["lo", "eth0"], "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}]}, "ansible_eth0": {"device": "eth0", "macaddress": "0e:39:03:af:ed:a3", "mtu": 9001, 
"active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.41.238", "broadcast": "10.31.43.255", "netmask": "255.255.252.0", "network": "10.31.40.0", "prefix": "22"}, "ipv6": [{"address": "fe80::a0b7:fdc4:48e8:7158", "prefix": "64", "scope": "link"}]}, "ansible_default_ipv4": {"gateway": "10.31.40.1", "interface": "eth0", "address": "10.31.41.238", "broadcast": "10.31.43.255", "netmask": "255.255.252.0", "network": "10.31.40.0", "prefix": "22", "macaddress": "0e:39:03:af:ed:a3", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.41.238"], "ansible_all_ipv6_addresses": ["fe80::a0b7:fdc4:48e8:7158"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.41.238", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::a0b7:fdc4:48e8:7158"]}, "ansible_service_mgr": "systemd", "ansible_pkg_mgr": "dnf", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 8454 1726882407.08380: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.41.238 closed. 
<<< 8454 1726882407.08384: stdout chunk (state=3): >>><<< 8454 1726882407.08386: stderr chunk (state=3): >>><<< 8454 1726882407.08503: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"ansible_system": "Linux", "ansible_kernel": "6.10.9-100.fc39.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Mon Sep 9 02:28:01 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-41-238.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-41-238", "ansible_nodename": "ip-10-31-41-238.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec243b3f63949f99a85dd461938b27f6", "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_loadavg": {"1m": 0.5126953125, "5m": 0.3125, "15m": 0.16357421875}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_fibre_channel_wwn": [], "ansible_ssh_host_key_dsa_public": 
"AAAAB3NzaC1kc3MAAACBANYijgij1fEhTOf5yay/qzv1+ckF/sTeAcrQU7mSl4JlHlSjJFRS9ZOEcTyZhIM24bmrUmUAXGByisr1fJhHdM1w6H4TV9d8eAGz5dvqRt3OMFXU98TIudAuK2zln4nrfCSz2a6X/3opBJuckX9rZaO0ickijdGATG1zU5j6yse5AAAAFQCB9h5S0fKeFdZzGlNOZp7suEtGLQAAAIBqL8YusJlS5M+t8hqB5XoiVX2JRwxeO45o1F+YDEL8s88gEWv3QxNNB5xqhdMrEbA13n8FJWfZZdvcU7PONunHJRbKJZFHcCdK5TI9eGObNVaZTYNSFhZ2BieAeUf4m7eiHxQI/o71WHee8ehKt8oSXovKXzKzzh6V8adityCM0QAAAIBN41A6QjMEnqm1991CkEko30YVBdWgdcunoDD7NJJoONNTR054WsZIbydxWyYVFa3fC4HcmSYJjXHxuSZuCmzFZYOzQSedVWWiET/kLEvIDxOZEQ44DCsa3zg29Ty97IbNNwLSIOFXoUbWllCQV9qge0q5dQ/J1wTQdymso3DyLQ==", "ansible_ssh_host_key_dsa_public_keytype": "ssh-dss", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQC9Kn//qIm4a6eEMMZR4qpdJnSJr3FLIr3UgGTmVZPamYNdQ29vbdZONWFxDxoVqR69dvgMi7B4CVdaUu6QyBPlMI1hnnTMD7yGuFvf0wDLvk2p5tQf1MwOc8WdJCPqkvcCYTOD4gBf/qOT2LG96u3e6y9NpDSs43WwFzV2YMOpEVnbg+17SPjuOyE07jTJi4gLXbcjXxt9rz8nQMlsQPFysakPATk6pjVZnnTWDcFUSfc1sUdO6IWl/O4jlB/QtP/FkO38YQbSYx1fiZNsk+JP6ZeZ4F0trwlxRemd9P6eEqtA9jVdvSCvJNHgZoMob64uw1c2P8BFaAByky5crE35ggw6pKcQTHTAHhrPBTx12gpwlL4rB+OoysKYhxI8VeW+TYiNWBxF+EUpmcj/QMfOOgNbIEeK+YfNZ606vwhkyjORVqaN3MswYozhtwmAoyxDKaTAYWXo4+d+GqZ7pURKpwdZrI8M7e8Nvd+dwpW3OtrfAqXvFwIrBrivFfWnDE0=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBBsLvf+TKoJqIfm5H0y7RP9w7PN99SnDATkd0bkTPwuIbQqBA6MAihYQaVCQtnKQCWC09GNZyMQSeayjLONajkY=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIME0e8Dw9KPaZbGCYYNAh3+j3dHxYuGhpELosAmEvhOR", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_distribution": "Fedora", "ansible_distribution_release": "", "ansible_distribution_version": "39", "ansible_distribution_major_version": "39", "ansible_distribution_file_path": "/etc/redhat-release", 
"ansible_distribution_file_variety": "RedHat", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.9-100.fc39.x86_64", "root": "UUID=f92a5a40-e33d-4a6f-8746-997eff27cfbd", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.9-100.fc39.x86_64", "root": "UUID=f92a5a40-e33d-4a6f-8746-997eff27cfbd", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": ["tty0", "ttyS0,115200n8"]}, "ansible_is_chroot": false, "ansible_hostnqn": "", "ansible_dns": {"nameservers": ["127.0.0.53"], "options": {"edns0": true, "trust-ad": true}, "search": ["us-east-1.aws.redhat.com"]}, "ansible_local": {}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3717, "ansible_memfree_mb": 2881, "ansible_swaptotal_mb": 3716, "ansible_swapfree_mb": 3716, "ansible_memory_mb": {"real": {"total": 3717, "used": 836, "free": 2881}, "nocache": {"free": 3456, "used": 261}, "swap": {"total": 3716, "free": 3716, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec243b3f-6394-9f99-a85d-d461938b27f6", "ansible_product_uuid": 
"ec243b3f-6394-9f99-a85d-d461938b27f6", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["f92a5a40-e33d-4a6f-8746-997eff27cfbd"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "f92a5a40-e33d-4a6f-8746-997eff27cfbd", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}, "zram0": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "4096", "partitions": {}, "rotational": "0", "scheduler_mode": "", "sectors": "7612416", "sectorsize": "4096", "size": "3.63 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["f92a5a40-e33d-4a6f-8746-997eff27cfbd"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 550, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "ext4", "options": "rw,seclabel,relatime", "dump": 0, "passno": 0, "size_total": 264124022784, "size_available": 251223289856, "block_size": 4096, "block_total": 64483404, "block_available": 61333811, "block_used": 3149593, "inode_total": 16384000, "inode_available": 16303859, "inode_used": 80141, "uuid": "f92a5a40-e33d-4a6f-8746-997eff27cfbd"}], "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, 
"releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_apparmor": {"status": "disabled"}, "ansible_lsb": {}, "ansible_iscsi_iqn": "", "ansible_env": {"SHELL": "/bin/bash", "EDITOR": "/usr/bin/nano", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.14.145 37172 10.31.41.238 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.14.145 37172 22", "DEBUGINFOD_URLS": "https://debuginfod.fedoraproject.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/1"}, "ansible_fips": false, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "21", "minute": "33", "second": "27", "epoch": "1726882407", "epoch_int": "1726882407", "date": "2024-09-20", "time": "21:33:27", "iso8601_micro": "2024-09-21T01:33:27.008232Z", "iso8601": "2024-09-21T01:33:27Z", "iso8601_basic": "20240920T213327008232", "iso8601_basic_short": "20240920T213327", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_interfaces": ["lo", "eth0"], "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}]}, "ansible_eth0": {"device": "eth0", "macaddress": "0e:39:03:af:ed:a3", "mtu": 9001, "active": true, "module": 
"xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.41.238", "broadcast": "10.31.43.255", "netmask": "255.255.252.0", "network": "10.31.40.0", "prefix": "22"}, "ipv6": [{"address": "fe80::a0b7:fdc4:48e8:7158", "prefix": "64", "scope": "link"}]}, "ansible_default_ipv4": {"gateway": "10.31.40.1", "interface": "eth0", "address": "10.31.41.238", "broadcast": "10.31.43.255", "netmask": "255.255.252.0", "network": "10.31.40.0", "prefix": "22", "macaddress": "0e:39:03:af:ed:a3", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.41.238"], "ansible_all_ipv6_addresses": ["fe80::a0b7:fdc4:48e8:7158"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.41.238", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::a0b7:fdc4:48e8:7158"]}, "ansible_service_mgr": "systemd", "ansible_pkg_mgr": "dnf", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 Shared connection to 10.31.41.238 closed. 8454 1726882407.08825: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882405.8978212-8624-191204422007956/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 8454 1726882407.08861: _low_level_execute_command(): starting 8454 1726882407.08871: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882405.8978212-8624-191204422007956/ > /dev/null 2>&1 && sleep 0' 8454 1726882407.09507: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 8454 1726882407.09523: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 8454 1726882407.09541: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 8454 1726882407.09559: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 8454 1726882407.09573: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 <<< 8454 1726882407.09584: stderr chunk (state=3): >>>debug2: match not found <<< 8454 1726882407.09603: stderr 
chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8454 1726882407.09652: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8454 1726882407.09712: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 8454 1726882407.09719: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 8454 1726882407.09740: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8454 1726882407.09924: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 8454 1726882407.12985: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8454 1726882407.12988: stdout chunk (state=3): >>><<< 8454 1726882407.12991: stderr chunk (state=3): >>><<< 8454 1726882407.13140: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 8454 1726882407.13144: handler run complete 8454 1726882407.13236: variable 'ansible_facts' from source: unknown 8454 1726882407.13399: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8454 1726882407.13891: variable 'ansible_facts' from source: unknown 8454 1726882407.14223: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8454 1726882407.14438: attempt loop complete, returning result 8454 1726882407.14449: _execute() done 8454 1726882407.14457: dumping result to json 8454 1726882407.14494: done dumping result, returning 8454 1726882407.14507: done running TaskExecutor() for managed_node3/TASK: Gathering Facts [0affe814-3a2d-f59f-16b9-000000000128] 8454 1726882407.14527: sending task result for task 0affe814-3a2d-f59f-16b9-000000000128 ok: [managed_node3] 8454 1726882407.15630: no more pending results, returning what we have 8454 1726882407.15637: results queue empty 8454 1726882407.15638: checking for any_errors_fatal 8454 1726882407.15640: done checking for any_errors_fatal 8454 1726882407.15641: checking for max_fail_percentage 8454 1726882407.15643: done checking for max_fail_percentage 8454 1726882407.15644: checking to see if all hosts have failed and the running result is not ok 8454 1726882407.15645: done checking to see if all hosts have failed 8454 1726882407.15646: getting the remaining hosts for this loop 8454 1726882407.15648: done getting the 
remaining hosts for this loop 8454 1726882407.15652: getting the next task for host managed_node3 8454 1726882407.15657: done getting next task for host managed_node3 8454 1726882407.15659: ^ task is: TASK: meta (flush_handlers) 8454 1726882407.15661: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8454 1726882407.15665: getting variables 8454 1726882407.15667: in VariableManager get_vars() 8454 1726882407.15712: Calling all_inventory to load vars for managed_node3 8454 1726882407.15716: Calling groups_inventory to load vars for managed_node3 8454 1726882407.15719: Calling all_plugins_inventory to load vars for managed_node3 8454 1726882407.15726: done sending task result for task 0affe814-3a2d-f59f-16b9-000000000128 8454 1726882407.15729: WORKER PROCESS EXITING 8454 1726882407.15742: Calling all_plugins_play to load vars for managed_node3 8454 1726882407.15746: Calling groups_plugins_inventory to load vars for managed_node3 8454 1726882407.15750: Calling groups_plugins_play to load vars for managed_node3 8454 1726882407.15977: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8454 1726882407.16285: done with get_vars() 8454 1726882407.16297: done getting variables 8454 1726882407.16389: in VariableManager get_vars() 8454 1726882407.16407: Calling all_inventory to load vars for managed_node3 8454 1726882407.16410: Calling groups_inventory to load vars for managed_node3 8454 1726882407.16413: Calling all_plugins_inventory to load vars for managed_node3 8454 1726882407.16418: Calling all_plugins_play to load vars for managed_node3 8454 1726882407.16422: Calling groups_plugins_inventory to load vars for managed_node3 8454 1726882407.16426: Calling 
groups_plugins_play to load vars for managed_node3 8454 1726882407.16673: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8454 1726882407.16971: done with get_vars() 8454 1726882407.16988: done queuing things up, now waiting for results queue to drain 8454 1726882407.16997: results queue empty 8454 1726882407.16999: checking for any_errors_fatal 8454 1726882407.17006: done checking for any_errors_fatal 8454 1726882407.17008: checking for max_fail_percentage 8454 1726882407.17009: done checking for max_fail_percentage 8454 1726882407.17010: checking to see if all hosts have failed and the running result is not ok 8454 1726882407.17011: done checking to see if all hosts have failed 8454 1726882407.17011: getting the remaining hosts for this loop 8454 1726882407.17013: done getting the remaining hosts for this loop 8454 1726882407.17015: getting the next task for host managed_node3 8454 1726882407.17020: done getting next task for host managed_node3 8454 1726882407.17023: ^ task is: TASK: INIT Prepare setup 8454 1726882407.17025: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8454 1726882407.17028: getting variables 8454 1726882407.17029: in VariableManager get_vars() 8454 1726882407.17050: Calling all_inventory to load vars for managed_node3 8454 1726882407.17052: Calling groups_inventory to load vars for managed_node3 8454 1726882407.17055: Calling all_plugins_inventory to load vars for managed_node3 8454 1726882407.17061: Calling all_plugins_play to load vars for managed_node3 8454 1726882407.17064: Calling groups_plugins_inventory to load vars for managed_node3 8454 1726882407.17068: Calling groups_plugins_play to load vars for managed_node3 8454 1726882407.17273: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8454 1726882407.17567: done with get_vars() 8454 1726882407.17576: done getting variables 8454 1726882407.17667: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) TASK [INIT Prepare setup] ****************************************************** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond.yml:15 Friday 20 September 2024 21:33:27 -0400 (0:00:01.342) 0:00:05.194 ****** 8454 1726882407.17698: entering _queue_task() for managed_node3/debug 8454 1726882407.17700: Creating lock for debug 8454 1726882407.18052: worker is 1 (out of 1 available) 8454 1726882407.18064: exiting _queue_task() for managed_node3/debug 8454 1726882407.18074: done queuing things up, now waiting for results queue to drain 8454 1726882407.18076: waiting for pending results... 
8454 1726882407.18291: running TaskExecutor() for managed_node3/TASK: INIT Prepare setup 8454 1726882407.18416: in run() - task 0affe814-3a2d-f59f-16b9-00000000000b 8454 1726882407.18445: variable 'ansible_search_path' from source: unknown 8454 1726882407.18495: calling self._execute() 8454 1726882407.18600: variable 'ansible_host' from source: host vars for 'managed_node3' 8454 1726882407.18615: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8454 1726882407.18646: variable 'omit' from source: magic vars 8454 1726882407.19217: variable 'ansible_distribution_major_version' from source: facts 8454 1726882407.19238: Evaluated conditional (ansible_distribution_major_version != '6'): True 8454 1726882407.19251: variable 'omit' from source: magic vars 8454 1726882407.19291: variable 'omit' from source: magic vars 8454 1726882407.19400: variable 'omit' from source: magic vars 8454 1726882407.19404: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 8454 1726882407.19446: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 8454 1726882407.19474: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 8454 1726882407.19511: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8454 1726882407.19538: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8454 1726882407.19582: variable 'inventory_hostname' from source: host vars for 'managed_node3' 8454 1726882407.19617: variable 'ansible_host' from source: host vars for 'managed_node3' 8454 1726882407.19620: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8454 1726882407.19752: Set connection var ansible_connection to ssh 8454 1726882407.19835: Set connection var 
ansible_shell_executable to /bin/sh 8454 1726882407.19843: Set connection var ansible_timeout to 10 8454 1726882407.19846: Set connection var ansible_shell_type to sh 8454 1726882407.19848: Set connection var ansible_pipelining to False 8454 1726882407.19850: Set connection var ansible_module_compression to ZIP_DEFLATED 8454 1726882407.19852: variable 'ansible_shell_executable' from source: unknown 8454 1726882407.19854: variable 'ansible_connection' from source: unknown 8454 1726882407.19856: variable 'ansible_module_compression' from source: unknown 8454 1726882407.19858: variable 'ansible_shell_type' from source: unknown 8454 1726882407.19860: variable 'ansible_shell_executable' from source: unknown 8454 1726882407.19869: variable 'ansible_host' from source: host vars for 'managed_node3' 8454 1726882407.19877: variable 'ansible_pipelining' from source: unknown 8454 1726882407.19887: variable 'ansible_timeout' from source: unknown 8454 1726882407.19895: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8454 1726882407.20078: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 8454 1726882407.20101: variable 'omit' from source: magic vars 8454 1726882407.20113: starting attempt loop 8454 1726882407.20121: running the handler 8454 1726882407.20240: handler run complete 8454 1726882407.20243: attempt loop complete, returning result 8454 1726882407.20246: _execute() done 8454 1726882407.20248: dumping result to json 8454 1726882407.20251: done dumping result, returning 8454 1726882407.20254: done running TaskExecutor() for managed_node3/TASK: INIT Prepare setup [0affe814-3a2d-f59f-16b9-00000000000b] 8454 1726882407.20260: sending task result for task 0affe814-3a2d-f59f-16b9-00000000000b ok: 
[managed_node3] => {} MSG: ################################################## 8454 1726882407.20499: no more pending results, returning what we have 8454 1726882407.20503: results queue empty 8454 1726882407.20504: checking for any_errors_fatal 8454 1726882407.20506: done checking for any_errors_fatal 8454 1726882407.20507: checking for max_fail_percentage 8454 1726882407.20509: done checking for max_fail_percentage 8454 1726882407.20511: checking to see if all hosts have failed and the running result is not ok 8454 1726882407.20512: done checking to see if all hosts have failed 8454 1726882407.20513: getting the remaining hosts for this loop 8454 1726882407.20515: done getting the remaining hosts for this loop 8454 1726882407.20519: getting the next task for host managed_node3 8454 1726882407.20527: done getting next task for host managed_node3 8454 1726882407.20531: ^ task is: TASK: Install dnsmasq 8454 1726882407.20536: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8454 1726882407.20541: getting variables 8454 1726882407.20543: in VariableManager get_vars() 8454 1726882407.20769: Calling all_inventory to load vars for managed_node3 8454 1726882407.20773: Calling groups_inventory to load vars for managed_node3 8454 1726882407.20776: Calling all_plugins_inventory to load vars for managed_node3 8454 1726882407.20789: Calling all_plugins_play to load vars for managed_node3 8454 1726882407.20793: Calling groups_plugins_inventory to load vars for managed_node3 8454 1726882407.20797: Calling groups_plugins_play to load vars for managed_node3 8454 1726882407.21067: done sending task result for task 0affe814-3a2d-f59f-16b9-00000000000b 8454 1726882407.21070: WORKER PROCESS EXITING 8454 1726882407.21099: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8454 1726882407.21401: done with get_vars() 8454 1726882407.21412: done getting variables 8454 1726882407.21487: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Install dnsmasq] ********************************************************* task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_test_interfaces_with_dhcp.yml:3 Friday 20 September 2024 21:33:27 -0400 (0:00:00.038) 0:00:05.232 ****** 8454 1726882407.21520: entering _queue_task() for managed_node3/package 8454 1726882407.21844: worker is 1 (out of 1 available) 8454 1726882407.21858: exiting _queue_task() for managed_node3/package 8454 1726882407.21868: done queuing things up, now waiting for results queue to drain 8454 1726882407.21870: waiting for pending results... 
8454 1726882407.22083: running TaskExecutor() for managed_node3/TASK: Install dnsmasq 8454 1726882407.22239: in run() - task 0affe814-3a2d-f59f-16b9-00000000000f 8454 1726882407.22260: variable 'ansible_search_path' from source: unknown 8454 1726882407.22269: variable 'ansible_search_path' from source: unknown 8454 1726882407.22321: calling self._execute() 8454 1726882407.22440: variable 'ansible_host' from source: host vars for 'managed_node3' 8454 1726882407.22445: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8454 1726882407.22526: variable 'omit' from source: magic vars 8454 1726882407.22930: variable 'ansible_distribution_major_version' from source: facts 8454 1726882407.22953: Evaluated conditional (ansible_distribution_major_version != '6'): True 8454 1726882407.22974: variable 'omit' from source: magic vars 8454 1726882407.23048: variable 'omit' from source: magic vars 8454 1726882407.23344: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 8454 1726882407.26561: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 8454 1726882407.26719: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 8454 1726882407.26999: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 8454 1726882407.27007: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 8454 1726882407.27050: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 8454 1726882407.27439: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 8454 1726882407.27443: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 8454 1726882407.27446: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 8454 1726882407.27658: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 8454 1726882407.27683: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 8454 1726882407.27988: variable '__network_is_ostree' from source: set_fact 8454 1726882407.28064: variable 'omit' from source: magic vars 8454 1726882407.28104: variable 'omit' from source: magic vars 8454 1726882407.28212: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 8454 1726882407.28307: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 8454 1726882407.28311: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 8454 1726882407.28415: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8454 1726882407.28438: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8454 1726882407.28525: variable 'inventory_hostname' from source: host vars for 'managed_node3' 8454 1726882407.28528: variable 'ansible_host' from source: host vars for 'managed_node3' 8454 1726882407.28531: variable 'ansible_ssh_extra_args' from source: host 
vars for 'managed_node3' 8454 1726882407.29042: Set connection var ansible_connection to ssh 8454 1726882407.29046: Set connection var ansible_shell_executable to /bin/sh 8454 1726882407.29048: Set connection var ansible_timeout to 10 8454 1726882407.29051: Set connection var ansible_shell_type to sh 8454 1726882407.29054: Set connection var ansible_pipelining to False 8454 1726882407.29056: Set connection var ansible_module_compression to ZIP_DEFLATED 8454 1726882407.29058: variable 'ansible_shell_executable' from source: unknown 8454 1726882407.29342: variable 'ansible_connection' from source: unknown 8454 1726882407.29347: variable 'ansible_module_compression' from source: unknown 8454 1726882407.29350: variable 'ansible_shell_type' from source: unknown 8454 1726882407.29352: variable 'ansible_shell_executable' from source: unknown 8454 1726882407.29355: variable 'ansible_host' from source: host vars for 'managed_node3' 8454 1726882407.29357: variable 'ansible_pipelining' from source: unknown 8454 1726882407.29359: variable 'ansible_timeout' from source: unknown 8454 1726882407.29361: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8454 1726882407.29616: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 8454 1726882407.29854: variable 'omit' from source: magic vars 8454 1726882407.29857: starting attempt loop 8454 1726882407.29860: running the handler 8454 1726882407.29863: variable 'ansible_facts' from source: unknown 8454 1726882407.29874: variable 'ansible_facts' from source: unknown 8454 1726882407.30202: _low_level_execute_command(): starting 8454 1726882407.30207: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 8454 1726882407.31554: stderr chunk 
(state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8454 1726882407.31603: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 8454 1726882407.31697: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 8454 1726882407.31718: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8454 1726882407.32263: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8454 1726882407.33936: stdout chunk (state=3): >>>/root <<< 8454 1726882407.34235: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8454 1726882407.34242: stdout chunk (state=3): >>><<< 8454 1726882407.34253: stderr chunk (state=3): >>><<< 8454 1726882407.34274: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not 
found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 8454 1726882407.34292: _low_level_execute_command(): starting 8454 1726882407.34299: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882407.3427513-8674-63141737936824 `" && echo ansible-tmp-1726882407.3427513-8674-63141737936824="` echo /root/.ansible/tmp/ansible-tmp-1726882407.3427513-8674-63141737936824 `" ) && sleep 0' 8454 1726882407.35702: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 8454 1726882407.35812: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 8454 1726882407.36035: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 8454 1726882407.38677: stdout chunk (state=3): >>>ansible-tmp-1726882407.3427513-8674-63141737936824=/root/.ansible/tmp/ansible-tmp-1726882407.3427513-8674-63141737936824 <<< 8454 1726882407.38804: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8454 1726882407.38873: stderr chunk (state=3): >>><<< 8454 1726882407.38881: stdout chunk (state=3): >>><<< 8454 1726882407.39046: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882407.3427513-8674-63141737936824=/root/.ansible/tmp/ansible-tmp-1726882407.3427513-8674-63141737936824 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 8454 1726882407.39051: variable 'ansible_module_compression' from source: unknown 8454 1726882407.39055: ANSIBALLZ: Using generic lock for ansible.legacy.dnf 8454 1726882407.39058: ANSIBALLZ: Acquiring lock 8454 1726882407.39060: ANSIBALLZ: Lock acquired: 140055527345136 8454 1726882407.39062: ANSIBALLZ: Creating module 8454 1726882407.73860: ANSIBALLZ: Writing module into payload 8454 1726882407.74165: ANSIBALLZ: Writing module 8454 1726882407.74190: ANSIBALLZ: Renaming module 8454 1726882407.74197: ANSIBALLZ: Done creating module 8454 1726882407.74216: variable 'ansible_facts' from source: unknown 8454 1726882407.74315: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882407.3427513-8674-63141737936824/AnsiballZ_dnf.py 8454 1726882407.74499: Sending initial data 8454 1726882407.74511: Sent initial data (149 bytes) 8454 1726882407.75251: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found 
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8454 1726882407.75317: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 8454 1726882407.75378: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 8454 1726882407.75573: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8454 1726882407.77314: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 8454 1726882407.77346: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 8454 1726882407.77447: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 8454 1726882407.77599: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-8454ynh6eu7y/tmpgq710o13 /root/.ansible/tmp/ansible-tmp-1726882407.3427513-8674-63141737936824/AnsiballZ_dnf.py <<< 8454 1726882407.77602: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882407.3427513-8674-63141737936824/AnsiballZ_dnf.py" <<< 8454 1726882407.77712: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-8454ynh6eu7y/tmpgq710o13" to remote "/root/.ansible/tmp/ansible-tmp-1726882407.3427513-8674-63141737936824/AnsiballZ_dnf.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882407.3427513-8674-63141737936824/AnsiballZ_dnf.py" <<< 8454 1726882407.79771: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8454 1726882407.79774: stdout chunk (state=3): >>><<< 8454 1726882407.79777: stderr chunk (state=3): >>><<< 8454 1726882407.79782: done transferring module to remote 8454 1726882407.79784: _low_level_execute_command(): starting 8454 1726882407.79787: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882407.3427513-8674-63141737936824/ /root/.ansible/tmp/ansible-tmp-1726882407.3427513-8674-63141737936824/AnsiballZ_dnf.py && sleep 0' 8454 1726882407.80373: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 8454 1726882407.80389: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 8454 1726882407.80409: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 8454 1726882407.80428: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 8454 1726882407.80494: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration 
data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8454 1726882407.80557: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 8454 1726882407.80575: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 8454 1726882407.80605: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8454 1726882407.80748: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8454 1726882407.82813: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8454 1726882407.82816: stdout chunk (state=3): >>><<< 8454 1726882407.82819: stderr chunk (state=3): >>><<< 8454 1726882407.82940: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match 
for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 8454 1726882407.82943: _low_level_execute_command(): starting 8454 1726882407.82947: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882407.3427513-8674-63141737936824/AnsiballZ_dnf.py && sleep 0' 8454 1726882407.83546: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 8454 1726882407.83562: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 8454 1726882407.83698: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 8454 1726882407.83702: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 8454 1726882407.83726: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8454 1726882407.83884: stderr chunk (state=3): 
>>>debug1: mux_client_request_session: master session id: 2 <<< 8454 1726882409.29936: stdout chunk (state=3): >>> {"msg": "Nothing to do", "changed": false, "results": [], "rc": 0, "invocation": {"module_args": {"name": ["dnsmasq"], "state": "present", "allow_downgrade": false, "allowerasing": false, "autoremove": false, "bugfix": false, "cacheonly": false, "disable_gpg_check": false, "disable_plugin": [], "disablerepo": [], "download_only": false, "enable_plugin": [], "enablerepo": [], "exclude": [], "installroot": "/", "install_repoquery": true, "install_weak_deps": true, "security": false, "skip_broken": false, "update_cache": false, "update_only": false, "validate_certs": true, "sslverify": true, "lock_timeout": 30, "use_backend": "auto", "best": null, "conf_file": null, "disable_excludes": null, "download_dir": null, "list": null, "nobest": null, "releasever": null}}} <<< 8454 1726882409.34711: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.41.238 closed. 
<<< 8454 1726882409.34775: stderr chunk (state=3): >>><<< 8454 1726882409.34780: stdout chunk (state=3): >>><<< 8454 1726882409.34799: _low_level_execute_command() done: rc=0, stdout= {"msg": "Nothing to do", "changed": false, "results": [], "rc": 0, "invocation": {"module_args": {"name": ["dnsmasq"], "state": "present", "allow_downgrade": false, "allowerasing": false, "autoremove": false, "bugfix": false, "cacheonly": false, "disable_gpg_check": false, "disable_plugin": [], "disablerepo": [], "download_only": false, "enable_plugin": [], "enablerepo": [], "exclude": [], "installroot": "/", "install_repoquery": true, "install_weak_deps": true, "security": false, "skip_broken": false, "update_cache": false, "update_only": false, "validate_certs": true, "sslverify": true, "lock_timeout": 30, "use_backend": "auto", "best": null, "conf_file": null, "disable_excludes": null, "download_dir": null, "list": null, "nobest": null, "releasever": null}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master 
version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.41.238 closed. 8454 1726882409.34856: done with _execute_module (ansible.legacy.dnf, {'name': 'dnsmasq', 'state': 'present', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.dnf', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882407.3427513-8674-63141737936824/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 8454 1726882409.34864: _low_level_execute_command(): starting 8454 1726882409.34872: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882407.3427513-8674-63141737936824/ > /dev/null 2>&1 && sleep 0' 8454 1726882409.35341: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 8454 1726882409.35344: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8454 1726882409.35347: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 8454 1726882409.35350: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8454 1726882409.35400: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 8454 1726882409.35404: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8454 1726882409.35516: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8454 1726882409.37741: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8454 1726882409.37745: stdout chunk (state=3): >>><<< 8454 1726882409.37747: stderr chunk (state=3): >>><<< 8454 1726882409.37750: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 
debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 8454 1726882409.37753: handler run complete 8454 1726882409.37859: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 8454 1726882409.38089: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 8454 1726882409.38127: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 8454 1726882409.38166: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 8454 1726882409.38201: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 8454 1726882409.38272: variable '__install_status' from source: unknown 8454 1726882409.38292: Evaluated conditional (__install_status is success): True 8454 1726882409.38311: attempt loop complete, returning result 8454 1726882409.38314: _execute() done 8454 1726882409.38317: dumping result to json 8454 1726882409.38323: done dumping result, returning 8454 1726882409.38331: done running TaskExecutor() for managed_node3/TASK: Install dnsmasq [0affe814-3a2d-f59f-16b9-00000000000f] 8454 1726882409.38338: sending task result for task 0affe814-3a2d-f59f-16b9-00000000000f 8454 1726882409.38489: done sending task result for task 0affe814-3a2d-f59f-16b9-00000000000f 8454 1726882409.38492: WORKER PROCESS EXITING ok: [managed_node3] => { "attempts": 1, "changed": false, "rc": 0, "results": [] } MSG: Nothing to do 8454 1726882409.38713: no more pending results, returning what we have 8454 1726882409.38717: results queue empty 8454 1726882409.38718: checking for any_errors_fatal 8454 1726882409.38724: done checking for any_errors_fatal 8454 1726882409.38725: checking for max_fail_percentage 8454 1726882409.38727: done checking for max_fail_percentage 8454 1726882409.38728: checking to see if all hosts have failed and the 
running result is not ok 8454 1726882409.38729: done checking to see if all hosts have failed 8454 1726882409.38730: getting the remaining hosts for this loop 8454 1726882409.38731: done getting the remaining hosts for this loop 8454 1726882409.38748: getting the next task for host managed_node3 8454 1726882409.38756: done getting next task for host managed_node3 8454 1726882409.38758: ^ task is: TASK: Install pgrep, sysctl 8454 1726882409.38762: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8454 1726882409.38765: getting variables 8454 1726882409.38767: in VariableManager get_vars() 8454 1726882409.38802: Calling all_inventory to load vars for managed_node3 8454 1726882409.38805: Calling groups_inventory to load vars for managed_node3 8454 1726882409.38807: Calling all_plugins_inventory to load vars for managed_node3 8454 1726882409.38815: Calling all_plugins_play to load vars for managed_node3 8454 1726882409.38818: Calling groups_plugins_inventory to load vars for managed_node3 8454 1726882409.38827: Calling groups_plugins_play to load vars for managed_node3 8454 1726882409.39041: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8454 1726882409.39341: done with get_vars() 8454 1726882409.39363: done getting variables 8454 1726882409.39443: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Install pgrep, sysctl] *************************************************** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_test_interfaces_with_dhcp.yml:17 Friday 20 September 2024 21:33:29 -0400 (0:00:02.179) 0:00:07.412 ****** 8454 1726882409.39485: entering _queue_task() for managed_node3/package 8454 1726882409.39968: worker is 1 (out of 1 available) 8454 1726882409.39981: exiting _queue_task() for managed_node3/package 8454 1726882409.39994: done queuing things up, now waiting for results queue to drain 8454 1726882409.39996: waiting for pending results... 
8454 1726882409.40109: running TaskExecutor() for managed_node3/TASK: Install pgrep, sysctl 8454 1726882409.40214: in run() - task 0affe814-3a2d-f59f-16b9-000000000010 8454 1726882409.40231: variable 'ansible_search_path' from source: unknown 8454 1726882409.40238: variable 'ansible_search_path' from source: unknown 8454 1726882409.40271: calling self._execute() 8454 1726882409.40354: variable 'ansible_host' from source: host vars for 'managed_node3' 8454 1726882409.40361: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8454 1726882409.40371: variable 'omit' from source: magic vars 8454 1726882409.40688: variable 'ansible_distribution_major_version' from source: facts 8454 1726882409.40700: Evaluated conditional (ansible_distribution_major_version != '6'): True 8454 1726882409.40799: variable 'ansible_os_family' from source: facts 8454 1726882409.40806: Evaluated conditional (ansible_os_family == 'RedHat'): True 8454 1726882409.40952: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 8454 1726882409.41181: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 8454 1726882409.41224: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 8454 1726882409.41255: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 8454 1726882409.41286: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 8454 1726882409.41444: variable 'ansible_distribution_major_version' from source: facts 8454 1726882409.41448: Evaluated conditional (ansible_distribution_major_version is version('6', '<=')): False 8454 1726882409.41451: when evaluation is False, skipping this task 8454 1726882409.41453: _execute() done 8454 1726882409.41455: dumping result to json 8454 1726882409.41458: done dumping result, returning 8454 
1726882409.41460: done running TaskExecutor() for managed_node3/TASK: Install pgrep, sysctl [0affe814-3a2d-f59f-16b9-000000000010] 8454 1726882409.41462: sending task result for task 0affe814-3a2d-f59f-16b9-000000000010 8454 1726882409.41543: done sending task result for task 0affe814-3a2d-f59f-16b9-000000000010 8454 1726882409.41546: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "ansible_distribution_major_version is version('6', '<=')", "skip_reason": "Conditional result was False" } 8454 1726882409.41694: no more pending results, returning what we have 8454 1726882409.41699: results queue empty 8454 1726882409.41700: checking for any_errors_fatal 8454 1726882409.41706: done checking for any_errors_fatal 8454 1726882409.41707: checking for max_fail_percentage 8454 1726882409.41709: done checking for max_fail_percentage 8454 1726882409.41710: checking to see if all hosts have failed and the running result is not ok 8454 1726882409.41711: done checking to see if all hosts have failed 8454 1726882409.41711: getting the remaining hosts for this loop 8454 1726882409.41713: done getting the remaining hosts for this loop 8454 1726882409.41718: getting the next task for host managed_node3 8454 1726882409.41724: done getting next task for host managed_node3 8454 1726882409.41727: ^ task is: TASK: Install pgrep, sysctl 8454 1726882409.41730: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8454 1726882409.41738: getting variables 8454 1726882409.41739: in VariableManager get_vars() 8454 1726882409.41789: Calling all_inventory to load vars for managed_node3 8454 1726882409.41792: Calling groups_inventory to load vars for managed_node3 8454 1726882409.41795: Calling all_plugins_inventory to load vars for managed_node3 8454 1726882409.41808: Calling all_plugins_play to load vars for managed_node3 8454 1726882409.41811: Calling groups_plugins_inventory to load vars for managed_node3 8454 1726882409.41814: Calling groups_plugins_play to load vars for managed_node3 8454 1726882409.42477: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8454 1726882409.43111: done with get_vars() 8454 1726882409.43123: done getting variables 8454 1726882409.43259: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Install pgrep, sysctl] *************************************************** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_test_interfaces_with_dhcp.yml:26 Friday 20 September 2024 21:33:29 -0400 (0:00:00.038) 0:00:07.450 ****** 8454 1726882409.43294: entering _queue_task() for managed_node3/package 8454 1726882409.44002: worker is 1 (out of 1 available) 8454 1726882409.44016: exiting _queue_task() for managed_node3/package 8454 1726882409.44028: done queuing things up, now waiting for results queue to drain 8454 1726882409.44030: waiting for pending results... 
8454 1726882409.44452: running TaskExecutor() for managed_node3/TASK: Install pgrep, sysctl 8454 1726882409.44456: in run() - task 0affe814-3a2d-f59f-16b9-000000000011 8454 1726882409.44461: variable 'ansible_search_path' from source: unknown 8454 1726882409.44464: variable 'ansible_search_path' from source: unknown 8454 1726882409.44485: calling self._execute() 8454 1726882409.44586: variable 'ansible_host' from source: host vars for 'managed_node3' 8454 1726882409.44602: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8454 1726882409.44621: variable 'omit' from source: magic vars 8454 1726882409.45085: variable 'ansible_distribution_major_version' from source: facts 8454 1726882409.45108: Evaluated conditional (ansible_distribution_major_version != '6'): True 8454 1726882409.45338: variable 'ansible_os_family' from source: facts 8454 1726882409.45461: Evaluated conditional (ansible_os_family == 'RedHat'): True 8454 1726882409.46017: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 8454 1726882409.46830: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 8454 1726882409.47015: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 8454 1726882409.47018: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 8454 1726882409.47061: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 8454 1726882409.47161: variable 'ansible_distribution_major_version' from source: facts 8454 1726882409.47185: Evaluated conditional (ansible_distribution_major_version is version('7', '>=')): True 8454 1726882409.47200: variable 'omit' from source: magic vars 8454 1726882409.47270: variable 'omit' from source: magic vars 8454 1726882409.47495: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 8454 1726882409.49986: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 8454 1726882409.50068: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 8454 1726882409.50120: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 8454 1726882409.50171: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 8454 1726882409.50210: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 8454 1726882409.50327: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 8454 1726882409.50373: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 8454 1726882409.50414: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 8454 1726882409.50481: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 8454 1726882409.50507: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 8454 1726882409.50632: variable '__network_is_ostree' from source: set_fact 8454 1726882409.50646: variable 
'omit' from source: magic vars 8454 1726882409.50688: variable 'omit' from source: magic vars 8454 1726882409.50725: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 8454 1726882409.50766: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 8454 1726882409.50904: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 8454 1726882409.50907: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8454 1726882409.50910: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8454 1726882409.50912: variable 'inventory_hostname' from source: host vars for 'managed_node3' 8454 1726882409.50915: variable 'ansible_host' from source: host vars for 'managed_node3' 8454 1726882409.50917: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8454 1726882409.51032: Set connection var ansible_connection to ssh 8454 1726882409.51053: Set connection var ansible_shell_executable to /bin/sh 8454 1726882409.51066: Set connection var ansible_timeout to 10 8454 1726882409.51074: Set connection var ansible_shell_type to sh 8454 1726882409.51091: Set connection var ansible_pipelining to False 8454 1726882409.51103: Set connection var ansible_module_compression to ZIP_DEFLATED 8454 1726882409.51141: variable 'ansible_shell_executable' from source: unknown 8454 1726882409.51150: variable 'ansible_connection' from source: unknown 8454 1726882409.51158: variable 'ansible_module_compression' from source: unknown 8454 1726882409.51166: variable 'ansible_shell_type' from source: unknown 8454 1726882409.51173: variable 'ansible_shell_executable' from source: unknown 8454 1726882409.51185: variable 'ansible_host' from source: host vars for 'managed_node3' 8454 1726882409.51195: 
variable 'ansible_pipelining' from source: unknown 8454 1726882409.51202: variable 'ansible_timeout' from source: unknown 8454 1726882409.51212: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8454 1726882409.51330: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 8454 1726882409.51354: variable 'omit' from source: magic vars 8454 1726882409.51364: starting attempt loop 8454 1726882409.51371: running the handler 8454 1726882409.51385: variable 'ansible_facts' from source: unknown 8454 1726882409.51451: variable 'ansible_facts' from source: unknown 8454 1726882409.51455: _low_level_execute_command(): starting 8454 1726882409.51457: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 8454 1726882409.52187: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 8454 1726882409.52209: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 8454 1726882409.52332: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master <<< 8454 1726882409.52368: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 8454 1726882409.52527: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8454 1726882409.54350: stdout chunk (state=3): >>>/root <<< 8454 1726882409.54553: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8454 1726882409.54556: stdout chunk (state=3): >>><<< 8454 1726882409.54559: stderr chunk (state=3): >>><<< 8454 1726882409.54684: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 8454 1726882409.54687: _low_level_execute_command(): starting 8454 1726882409.54691: _low_level_execute_command(): executing: /bin/sh -c '( 
umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882409.5458589-8764-172517739351215 `" && echo ansible-tmp-1726882409.5458589-8764-172517739351215="` echo /root/.ansible/tmp/ansible-tmp-1726882409.5458589-8764-172517739351215 `" ) && sleep 0' 8454 1726882409.55260: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 8454 1726882409.55284: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 8454 1726882409.55301: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 8454 1726882409.55323: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 8454 1726882409.55351: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 <<< 8454 1726882409.55391: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found <<< 8454 1726882409.55408: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8454 1726882409.55497: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 8454 1726882409.55517: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 8454 1726882409.55543: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8454 1726882409.55698: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8454 
1726882409.57796: stdout chunk (state=3): >>>ansible-tmp-1726882409.5458589-8764-172517739351215=/root/.ansible/tmp/ansible-tmp-1726882409.5458589-8764-172517739351215 <<< 8454 1726882409.58004: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8454 1726882409.58008: stdout chunk (state=3): >>><<< 8454 1726882409.58010: stderr chunk (state=3): >>><<< 8454 1726882409.58026: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882409.5458589-8764-172517739351215=/root/.ansible/tmp/ansible-tmp-1726882409.5458589-8764-172517739351215 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 8454 1726882409.58069: variable 'ansible_module_compression' from source: unknown 8454 1726882409.58192: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-8454ynh6eu7y/ansiballz_cache/ansible.modules.dnf-ZIP_DEFLATED 8454 1726882409.58195: variable 'ansible_facts' 
from source: unknown 8454 1726882409.58309: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882409.5458589-8764-172517739351215/AnsiballZ_dnf.py 8454 1726882409.58598: Sending initial data 8454 1726882409.58601: Sent initial data (150 bytes) 8454 1726882409.59262: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master <<< 8454 1726882409.59277: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 8454 1726882409.59301: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8454 1726882409.59438: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8454 1726882409.61138: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: 
Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 8454 1726882409.61290: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 8454 1726882409.61422: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-8454ynh6eu7y/tmpg4w9iboj /root/.ansible/tmp/ansible-tmp-1726882409.5458589-8764-172517739351215/AnsiballZ_dnf.py <<< 8454 1726882409.61426: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882409.5458589-8764-172517739351215/AnsiballZ_dnf.py" <<< 8454 1726882409.61529: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-8454ynh6eu7y/tmpg4w9iboj" to remote "/root/.ansible/tmp/ansible-tmp-1726882409.5458589-8764-172517739351215/AnsiballZ_dnf.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882409.5458589-8764-172517739351215/AnsiballZ_dnf.py" <<< 8454 1726882409.63542: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8454 1726882409.63546: stdout chunk (state=3): >>><<< 8454 1726882409.63548: stderr chunk (state=3): >>><<< 8454 1726882409.63551: done transferring module to remote 8454 1726882409.63553: _low_level_execute_command(): starting 8454 1726882409.63556: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882409.5458589-8764-172517739351215/ /root/.ansible/tmp/ansible-tmp-1726882409.5458589-8764-172517739351215/AnsiballZ_dnf.py && sleep 0' 8454 1726882409.64238: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 8454 1726882409.64343: stderr chunk 
(state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8454 1726882409.64377: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 8454 1726882409.64398: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 8454 1726882409.64421: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8454 1726882409.64582: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8454 1726882409.66537: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8454 1726882409.66545: stderr chunk (state=3): >>><<< 8454 1726882409.66548: stdout chunk (state=3): >>><<< 8454 1726882409.66562: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 8454 1726882409.66567: _low_level_execute_command(): starting 8454 1726882409.66628: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882409.5458589-8764-172517739351215/AnsiballZ_dnf.py && sleep 0' 8454 1726882409.67020: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 8454 1726882409.67023: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8454 1726882409.67026: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 8454 1726882409.67028: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 
10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8454 1726882409.67095: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 8454 1726882409.67098: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8454 1726882409.67213: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8454 1726882411.13761: stdout chunk (state=3): >>> {"msg": "Nothing to do", "changed": false, "results": [], "rc": 0, "invocation": {"module_args": {"name": ["procps-ng"], "state": "present", "allow_downgrade": false, "allowerasing": false, "autoremove": false, "bugfix": false, "cacheonly": false, "disable_gpg_check": false, "disable_plugin": [], "disablerepo": [], "download_only": false, "enable_plugin": [], "enablerepo": [], "exclude": [], "installroot": "/", "install_repoquery": true, "install_weak_deps": true, "security": false, "skip_broken": false, "update_cache": false, "update_only": false, "validate_certs": true, "sslverify": true, "lock_timeout": 30, "use_backend": "auto", "best": null, "conf_file": null, "disable_excludes": null, "download_dir": null, "list": null, "nobest": null, "releasever": null}}} <<< 8454 1726882411.18745: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.41.238 closed. 
<<< 8454 1726882411.18749: stdout chunk (state=3): >>><<< 8454 1726882411.18752: stderr chunk (state=3): >>><<< 8454 1726882411.18754: _low_level_execute_command() done: rc=0, stdout= {"msg": "Nothing to do", "changed": false, "results": [], "rc": 0, "invocation": {"module_args": {"name": ["procps-ng"], "state": "present", "allow_downgrade": false, "allowerasing": false, "autoremove": false, "bugfix": false, "cacheonly": false, "disable_gpg_check": false, "disable_plugin": [], "disablerepo": [], "download_only": false, "enable_plugin": [], "enablerepo": [], "exclude": [], "installroot": "/", "install_repoquery": true, "install_weak_deps": true, "security": false, "skip_broken": false, "update_cache": false, "update_only": false, "validate_certs": true, "sslverify": true, "lock_timeout": 30, "use_backend": "auto", "best": null, "conf_file": null, "disable_excludes": null, "download_dir": null, "list": null, "nobest": null, "releasever": null}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master 
version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.41.238 closed. 8454 1726882411.18762: done with _execute_module (ansible.legacy.dnf, {'name': 'procps-ng', 'state': 'present', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.dnf', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882409.5458589-8764-172517739351215/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 8454 1726882411.18774: _low_level_execute_command(): starting 8454 1726882411.18791: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882409.5458589-8764-172517739351215/ > /dev/null 2>&1 && sleep 0' 8454 1726882411.19701: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 8454 1726882411.19751: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 8454 1726882411.19764: stderr chunk 
(state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 8454 1726882411.19790: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 <<< 8454 1726882411.19875: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8454 1726882411.19900: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 8454 1726882411.19920: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8454 1726882411.20071: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8454 1726882411.22152: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8454 1726882411.22169: stdout chunk (state=3): >>><<< 8454 1726882411.22184: stderr chunk (state=3): >>><<< 8454 1726882411.22204: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK 
debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 8454 1726882411.22216: handler run complete 8454 1726882411.22291: attempt loop complete, returning result 8454 1726882411.22338: _execute() done 8454 1726882411.22341: dumping result to json 8454 1726882411.22344: done dumping result, returning 8454 1726882411.22346: done running TaskExecutor() for managed_node3/TASK: Install pgrep, sysctl [0affe814-3a2d-f59f-16b9-000000000011] 8454 1726882411.22348: sending task result for task 0affe814-3a2d-f59f-16b9-000000000011 ok: [managed_node3] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do 8454 1726882411.22575: no more pending results, returning what we have 8454 1726882411.22584: results queue empty 8454 1726882411.22585: checking for any_errors_fatal 8454 1726882411.22594: done checking for any_errors_fatal 8454 1726882411.22595: checking for max_fail_percentage 8454 1726882411.22597: done checking for max_fail_percentage 8454 1726882411.22599: checking to see if all hosts have failed and the running result is not ok 8454 1726882411.22600: done checking to see if all hosts have failed 8454 1726882411.22601: getting the remaining hosts for this loop 8454 1726882411.22603: done getting the remaining hosts for this loop 8454 1726882411.22608: getting the next task for host managed_node3 8454 1726882411.22615: done getting next task for host managed_node3 8454 1726882411.22618: ^ task is: TASK: Create test interfaces 8454 1726882411.22622: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8454 1726882411.22626: getting variables 8454 1726882411.22627: in VariableManager get_vars() 8454 1726882411.22892: Calling all_inventory to load vars for managed_node3 8454 1726882411.22896: Calling groups_inventory to load vars for managed_node3 8454 1726882411.22899: Calling all_plugins_inventory to load vars for managed_node3 8454 1726882411.22942: done sending task result for task 0affe814-3a2d-f59f-16b9-000000000011 8454 1726882411.22946: WORKER PROCESS EXITING 8454 1726882411.22964: Calling all_plugins_play to load vars for managed_node3 8454 1726882411.22968: Calling groups_plugins_inventory to load vars for managed_node3 8454 1726882411.22972: Calling groups_plugins_play to load vars for managed_node3 8454 1726882411.23377: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8454 1726882411.23674: done with get_vars() 8454 1726882411.23690: done getting variables 8454 1726882411.23809: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) TASK [Create test interfaces] ************************************************** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_test_interfaces_with_dhcp.yml:35 Friday 20 September 2024 21:33:31 -0400 (0:00:01.805) 0:00:09.256 ****** 8454 1726882411.23853: entering _queue_task() for managed_node3/shell 8454 1726882411.23855: Creating lock for shell 8454 1726882411.24259: worker is 1 (out of 1 available) 8454 1726882411.24274: exiting _queue_task() for managed_node3/shell 8454 1726882411.24289: done queuing 
things up, now waiting for results queue to drain 8454 1726882411.24290: waiting for pending results... 8454 1726882411.24488: running TaskExecutor() for managed_node3/TASK: Create test interfaces 8454 1726882411.24645: in run() - task 0affe814-3a2d-f59f-16b9-000000000012 8454 1726882411.24667: variable 'ansible_search_path' from source: unknown 8454 1726882411.24676: variable 'ansible_search_path' from source: unknown 8454 1726882411.24735: calling self._execute() 8454 1726882411.24839: variable 'ansible_host' from source: host vars for 'managed_node3' 8454 1726882411.24857: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8454 1726882411.24874: variable 'omit' from source: magic vars 8454 1726882411.25362: variable 'ansible_distribution_major_version' from source: facts 8454 1726882411.25393: Evaluated conditional (ansible_distribution_major_version != '6'): True 8454 1726882411.25405: variable 'omit' from source: magic vars 8454 1726882411.25476: variable 'omit' from source: magic vars 8454 1726882411.26019: variable 'dhcp_interface1' from source: play vars 8454 1726882411.26024: variable 'dhcp_interface2' from source: play vars 8454 1726882411.26076: variable 'omit' from source: magic vars 8454 1726882411.26113: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 8454 1726882411.26148: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 8454 1726882411.26164: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 8454 1726882411.26180: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8454 1726882411.26194: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8454 1726882411.26222: variable 'inventory_hostname' from source: host vars for 
'managed_node3' 8454 1726882411.26225: variable 'ansible_host' from source: host vars for 'managed_node3' 8454 1726882411.26230: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8454 1726882411.26323: Set connection var ansible_connection to ssh 8454 1726882411.26331: Set connection var ansible_shell_executable to /bin/sh 8454 1726882411.26341: Set connection var ansible_timeout to 10 8454 1726882411.26344: Set connection var ansible_shell_type to sh 8454 1726882411.26352: Set connection var ansible_pipelining to False 8454 1726882411.26359: Set connection var ansible_module_compression to ZIP_DEFLATED 8454 1726882411.26378: variable 'ansible_shell_executable' from source: unknown 8454 1726882411.26384: variable 'ansible_connection' from source: unknown 8454 1726882411.26387: variable 'ansible_module_compression' from source: unknown 8454 1726882411.26392: variable 'ansible_shell_type' from source: unknown 8454 1726882411.26395: variable 'ansible_shell_executable' from source: unknown 8454 1726882411.26399: variable 'ansible_host' from source: host vars for 'managed_node3' 8454 1726882411.26404: variable 'ansible_pipelining' from source: unknown 8454 1726882411.26407: variable 'ansible_timeout' from source: unknown 8454 1726882411.26413: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8454 1726882411.26528: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 8454 1726882411.26541: variable 'omit' from source: magic vars 8454 1726882411.26544: starting attempt loop 8454 1726882411.26548: running the handler 8454 1726882411.26561: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 8454 1726882411.26575: _low_level_execute_command(): starting 8454 1726882411.26587: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 8454 1726882411.27100: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 8454 1726882411.27104: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8454 1726882411.27107: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8454 1726882411.27159: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 8454 1726882411.27163: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 8454 1726882411.27281: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8454 1726882411.29027: stdout chunk (state=3): >>>/root <<< 8454 1726882411.29137: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8454 1726882411.29186: stderr chunk (state=3): >>><<< 8454 1726882411.29190: stdout 
chunk (state=3): >>><<< 8454 1726882411.29207: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 8454 1726882411.29219: _low_level_execute_command(): starting 8454 1726882411.29224: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882411.2920744-8811-263424694391983 `" && echo ansible-tmp-1726882411.2920744-8811-263424694391983="` echo /root/.ansible/tmp/ansible-tmp-1726882411.2920744-8811-263424694391983 `" ) && sleep 0' 8454 1726882411.29646: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 8454 1726882411.29654: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8454 1726882411.29673: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8454 1726882411.29727: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 8454 1726882411.29735: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8454 1726882411.29850: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8454 1726882411.31924: stdout chunk (state=3): >>>ansible-tmp-1726882411.2920744-8811-263424694391983=/root/.ansible/tmp/ansible-tmp-1726882411.2920744-8811-263424694391983 <<< 8454 1726882411.32200: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8454 1726882411.32204: stdout chunk (state=3): >>><<< 8454 1726882411.32206: stderr chunk (state=3): >>><<< 8454 1726882411.32209: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882411.2920744-8811-263424694391983=/root/.ansible/tmp/ansible-tmp-1726882411.2920744-8811-263424694391983 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 
originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 8454 1726882411.32211: variable 'ansible_module_compression' from source: unknown 8454 1726882411.32249: ANSIBALLZ: Using generic lock for ansible.legacy.command 8454 1726882411.32257: ANSIBALLZ: Acquiring lock 8454 1726882411.32264: ANSIBALLZ: Lock acquired: 140055527345136 8454 1726882411.32272: ANSIBALLZ: Creating module 8454 1726882411.48844: ANSIBALLZ: Writing module into payload 8454 1726882411.48961: ANSIBALLZ: Writing module 8454 1726882411.49050: ANSIBALLZ: Renaming module 8454 1726882411.49053: ANSIBALLZ: Done creating module 8454 1726882411.49055: variable 'ansible_facts' from source: unknown 8454 1726882411.49083: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882411.2920744-8811-263424694391983/AnsiballZ_command.py 8454 1726882411.49315: Sending initial data 8454 1726882411.49318: Sent initial data (154 bytes) 8454 1726882411.49900: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 8454 1726882411.49903: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 8454 1726882411.49905: stderr 
chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8454 1726882411.49908: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 8454 1726882411.49910: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found <<< 8454 1726882411.49913: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8454 1726882411.49968: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 8454 1726882411.49973: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8454 1726882411.50101: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8454 1726882411.51898: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server 
extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 8454 1726882411.52153: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 8454 1726882411.52289: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-8454ynh6eu7y/tmp5a63laqv /root/.ansible/tmp/ansible-tmp-1726882411.2920744-8811-263424694391983/AnsiballZ_command.py <<< 8454 1726882411.52293: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882411.2920744-8811-263424694391983/AnsiballZ_command.py" <<< 8454 1726882411.52416: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-8454ynh6eu7y/tmp5a63laqv" to remote "/root/.ansible/tmp/ansible-tmp-1726882411.2920744-8811-263424694391983/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882411.2920744-8811-263424694391983/AnsiballZ_command.py" <<< 8454 1726882411.54162: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8454 1726882411.54225: stderr chunk (state=3): >>><<< 8454 1726882411.54238: stdout chunk (state=3): >>><<< 8454 1726882411.54270: done transferring module to remote 8454 1726882411.54312: _low_level_execute_command(): starting 8454 1726882411.54325: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882411.2920744-8811-263424694391983/ /root/.ansible/tmp/ansible-tmp-1726882411.2920744-8811-263424694391983/AnsiballZ_command.py && sleep 0' 8454 1726882411.55160: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8454 1726882411.55196: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 8454 1726882411.55221: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 8454 1726882411.55244: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8454 1726882411.55515: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8454 1726882411.57338: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8454 1726882411.57385: stderr chunk (state=3): >>><<< 8454 1726882411.57388: stdout chunk (state=3): >>><<< 8454 1726882411.57400: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: 
checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 8454 1726882411.57403: _low_level_execute_command(): starting 8454 1726882411.57409: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882411.2920744-8811-263424694391983/AnsiballZ_command.py && sleep 0' 8454 1726882411.57811: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 8454 1726882411.57848: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 8454 1726882411.57852: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found <<< 8454 1726882411.57854: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8454 1726882411.57856: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 8454 1726882411.57859: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8454 1726882411.57918: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 
3 setting O_NONBLOCK <<< 8454 1726882411.57920: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8454 1726882411.58038: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8454 1726882413.01512: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "+ exec\n+ ip link add test1 type veth peer name test1p\n+ ip link add test2 type veth peer name test2p\n++ pgrep NetworkManager\n+ '[' -n 653 ']'\n+ nmcli d set test1 managed true\n+ nmcli d set test2 managed true\n+ nmcli d set test1p managed false\n+ nmcli d set test2p managed false\n+ ip link set test1p up\n+ ip link set test2p up\n+ ip link add name testbr type bridge forward_delay 0\n++ pgrep NetworkManager\n+ '[' -n 653 ']'\n+ nmcli d set testbr managed false\n+ ip link set testbr up\n+ timer=0\n+ ip addr show testbr\n+ grep -q 'inet [1-9]'\n+ let timer+=1\n+ '[' 1 -eq 30 ']'\n+ sleep 1\n+ rc=0\n+ ip addr add 192.0.2.1/24 dev testbr\n+ '[' 0 '!=' 0 ']'\n+ ip -6 addr add 2001:DB8::1/32 dev testbr\n+ '[' 0 '!=' 0 ']'\n+ ip addr show testbr\n+ grep -q 'inet [1-9]'\n+ grep 'release 6' /etc/redhat-release\n+ ip link set test1p master testbr\n+ ip link set test2p master testbr\n+ systemctl is-active firewalld\ninactive\n+ dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --dhcp-range=2001:DB8::10,2001:DB8::1FF,slaac,64,240 --enable-ra --interface=testbr --bind-interfaces", "rc": 0, "cmd": "set -euxo pipefail\nexec 1>&2\nip link add test1 type veth peer name test1p\nip link add test2 type veth peer name test2p\nif [ -n \"$(pgrep NetworkManager)\" ];then\n nmcli d set test1 managed true\n nmcli d set test2 managed true\n # NetworkManager should not manage DHCP server ports\n nmcli d set test1p managed false\n nmcli d set test2p managed false\nfi\nip link set test1p up\nip link set test2p up\n\n# Create the 'testbr' - providing both 10.x ipv4 and 2620:52:0 ipv6 dhcp\nip 
link add name testbr type bridge forward_delay 0\nif [ -n \"$(pgrep NetworkManager)\" ];then\n # NetworkManager should not manage DHCP server ports\n nmcli d set testbr managed false\nfi\nip link set testbr up\ntimer=0\n# The while loop following is a workaround for the NM bug, which can be\n# tracked in https://bugzilla.redhat.com/show_bug.cgi?id=2079642\nwhile ! ip addr show testbr | grep -q 'inet [1-9]'\ndo\n let \"timer+=1\"\n if [ $timer -eq 30 ]; then\n echo ERROR - could not add testbr\n ip addr\n exit 1\n fi\n sleep 1\n rc=0\n ip addr add 192.0.2.1/24 dev testbr || rc=\"$?\"\n if [ \"$rc\" != 0 ]; then\n echo NOTICE - could not add testbr - error code \"$rc\"\n continue\n fi\n ip -6 addr add 2001:DB8::1/32 dev testbr || rc=\"$?\"\n if [ \"$rc\" != 0 ]; then\n echo NOTICE - could not add testbr - error code \"$rc\"\n continue\n fi\ndone\n\nif grep 'release 6' /etc/redhat-release; then\n # We need bridge-utils and radvd only in rhel6\n if ! rpm -q --quiet radvd; then yum -y install radvd; fi\n if ! 
rpm -q --quiet bridge-utils; then yum -y install bridge-utils; fi\n\n # We need to add iptables rule to allow dhcp request\n iptables -I INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\n\n # Add test1, test2 peers into the testbr\n brctl addif testbr test1p\n brctl addif testbr test2p\n\n # in RHEL6 /run is not present\n mkdir -p /run\n\n # and dnsmasq does not support ipv6\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --interface=testbr --bind-interfaces\n\n # start radvd for ipv6\n echo 'interface testbr {' > /etc/radvd.conf\n echo ' AdvSendAdvert on;' >> /etc/radvd.conf\n echo ' prefix 2001:DB8::/64 { ' >> /etc/radvd.conf\n echo ' AdvOnLink on; }; ' >> /etc/radvd.conf\n echo ' }; ' >> /etc/radvd.conf\n\n # enable ipv6 forwarding\n sysctl -w net.ipv6.conf.all.forwarding=1\n service radvd restart\n\nelse\n ip link set test1p master testbr\n ip link set test2p master testbr\n # Run joint DHCP4/DHCP6 server with RA enabled in veth namespace\n if systemctl is-active firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if ! 
<<< 8454 1726882413.01539: stdout chunk (state=3): >>>firewall-cmd --query-service=\"$service\"; then\n firewall-cmd --add-service \"$service\"\n fi\n done\n fi\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --dhcp-range=2001:DB8::10,2001:DB8::1FF,slaac,64,240 --enable-ra --interface=testbr --bind-interfaces\nfi\n", "start": "2024-09-20 21:33:31.752193", "end": "2024-09-20 21:33:33.012443", "delta": "0:00:01.260250", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euxo pipefail\nexec 1>&2\nip link add test1 type veth peer name test1p\nip link add test2 type veth peer name test2p\nif [ -n \"$(pgrep NetworkManager)\" ];then\n nmcli d set test1 managed true\n nmcli d set test2 managed true\n # NetworkManager should not manage DHCP server ports\n nmcli d set test1p managed false\n nmcli d set test2p managed false\nfi\nip link set test1p up\nip link set test2p up\n\n# Create the 'testbr' - providing both 10.x ipv4 and 2620:52:0 ipv6 dhcp\nip link add name testbr type bridge forward_delay 0\nif [ -n \"$(pgrep NetworkManager)\" ];then\n # NetworkManager should not manage DHCP server ports\n nmcli d set testbr managed false\nfi\nip link set testbr up\ntimer=0\n# The while loop following is a workaround for the NM bug, which can be\n# tracked in https://bugzilla.redhat.com/show_bug.cgi?id=2079642\nwhile ! 
ip addr show testbr | grep -q 'inet [1-9]'\ndo\n let \"timer+=1\"\n if [ $timer -eq 30 ]; then\n echo ERROR - could not add testbr\n ip addr\n exit 1\n fi\n sleep 1\n rc=0\n ip addr add 192.0.2.1/24 dev testbr || rc=\"$?\"\n if [ \"$rc\" != 0 ]; then\n echo NOTICE - could not add testbr - error code \"$rc\"\n continue\n fi\n ip -6 addr add 2001:DB8::1/32 dev testbr || rc=\"$?\"\n if [ \"$rc\" != 0 ]; then\n echo NOTICE - could not add testbr - error code \"$rc\"\n continue\n fi\ndone\n\nif grep 'release 6' /etc/redhat-release; then\n # We need bridge-utils and radvd only in rhel6\n if ! rpm -q --quiet radvd; then yum -y install radvd; fi\n if ! rpm -q --quiet bridge-utils; then yum -y install bridge-utils; fi\n\n # We need to add iptables rule to allow dhcp request\n iptables -I INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\n\n # Add test1, test2 peers into the testbr\n brctl addif testbr test1p\n brctl addif testbr test2p\n\n # in RHEL6 /run is not present\n mkdir -p /run\n\n # and dnsmasq does not support ipv6\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --interface=testbr --bind-interfaces\n\n # start radvd for ipv6\n echo 'interface testbr {' > /etc/radvd.conf\n echo ' AdvSendAdvert on;' >> /etc/radvd.conf\n echo ' prefix 2001:DB8::/64 { ' >> /etc/radvd.conf\n echo ' AdvOnLink on; }; ' >> /etc/radvd.conf\n echo ' }; ' >> /etc/radvd.conf\n\n # enable ipv6 forwarding\n sysctl -w net.ipv6.conf.all.forwarding=1\n service radvd restart\n\nelse\n ip link set test1p master testbr\n ip link set test2p master testbr\n # Run joint DHCP4/DHCP6 server with RA enabled in veth namespace\n if systemctl is-active firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if ! 
firewall-cmd --query-service=\"$service\"; then\n firewall-cmd --add-service \"$service\"\n fi\n done\n fi\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --dhcp-range=2001:DB8::10,2001:DB8::1FF,slaac,64,240 --enable-ra --interface=testbr --bind-interfaces\nfi\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 8454 1726882413.03273: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8454 1726882413.03311: stderr chunk (state=3): >>>Shared connection to 10.31.41.238 closed. <<< 8454 1726882413.03315: stdout chunk (state=3): >>><<< 8454 1726882413.03317: stderr chunk (state=3): >>><<< 8454 1726882413.03350: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "+ exec\n+ ip link add test1 type veth peer name test1p\n+ ip link add test2 type veth peer name test2p\n++ pgrep NetworkManager\n+ '[' -n 653 ']'\n+ nmcli d set test1 managed true\n+ nmcli d set test2 managed true\n+ nmcli d set test1p managed false\n+ nmcli d set test2p managed false\n+ ip link set test1p up\n+ ip link set test2p up\n+ ip link add name testbr type bridge forward_delay 0\n++ pgrep NetworkManager\n+ '[' -n 653 ']'\n+ nmcli d set testbr managed false\n+ ip link set testbr up\n+ timer=0\n+ ip addr show testbr\n+ grep -q 'inet [1-9]'\n+ let timer+=1\n+ '[' 1 -eq 30 ']'\n+ sleep 1\n+ rc=0\n+ ip addr add 192.0.2.1/24 dev testbr\n+ '[' 0 '!=' 0 ']'\n+ ip -6 addr add 2001:DB8::1/32 dev testbr\n+ '[' 0 '!=' 0 ']'\n+ ip addr show testbr\n+ grep -q 'inet [1-9]'\n+ grep 'release 6' /etc/redhat-release\n+ ip link set test1p master testbr\n+ ip link set test2p master testbr\n+ systemctl is-active firewalld\ninactive\n+ dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease 
--dhcp-range=192.0.2.1,192.0.2.254,240 --dhcp-range=2001:DB8::10,2001:DB8::1FF,slaac,64,240 --enable-ra --interface=testbr --bind-interfaces", "rc": 0, "cmd": "set -euxo pipefail\nexec 1>&2\nip link add test1 type veth peer name test1p\nip link add test2 type veth peer name test2p\nif [ -n \"$(pgrep NetworkManager)\" ];then\n nmcli d set test1 managed true\n nmcli d set test2 managed true\n # NetworkManager should not manage DHCP server ports\n nmcli d set test1p managed false\n nmcli d set test2p managed false\nfi\nip link set test1p up\nip link set test2p up\n\n# Create the 'testbr' - providing both 10.x ipv4 and 2620:52:0 ipv6 dhcp\nip link add name testbr type bridge forward_delay 0\nif [ -n \"$(pgrep NetworkManager)\" ];then\n # NetworkManager should not manage DHCP server ports\n nmcli d set testbr managed false\nfi\nip link set testbr up\ntimer=0\n# The while loop following is a workaround for the NM bug, which can be\n# tracked in https://bugzilla.redhat.com/show_bug.cgi?id=2079642\nwhile ! ip addr show testbr | grep -q 'inet [1-9]'\ndo\n let \"timer+=1\"\n if [ $timer -eq 30 ]; then\n echo ERROR - could not add testbr\n ip addr\n exit 1\n fi\n sleep 1\n rc=0\n ip addr add 192.0.2.1/24 dev testbr || rc=\"$?\"\n if [ \"$rc\" != 0 ]; then\n echo NOTICE - could not add testbr - error code \"$rc\"\n continue\n fi\n ip -6 addr add 2001:DB8::1/32 dev testbr || rc=\"$?\"\n if [ \"$rc\" != 0 ]; then\n echo NOTICE - could not add testbr - error code \"$rc\"\n continue\n fi\ndone\n\nif grep 'release 6' /etc/redhat-release; then\n # We need bridge-utils and radvd only in rhel6\n if ! rpm -q --quiet radvd; then yum -y install radvd; fi\n if ! 
rpm -q --quiet bridge-utils; then yum -y install bridge-utils; fi\n\n # We need to add iptables rule to allow dhcp request\n iptables -I INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\n\n # Add test1, test2 peers into the testbr\n brctl addif testbr test1p\n brctl addif testbr test2p\n\n # in RHEL6 /run is not present\n mkdir -p /run\n\n # and dnsmasq does not support ipv6\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --interface=testbr --bind-interfaces\n\n # start radvd for ipv6\n echo 'interface testbr {' > /etc/radvd.conf\n echo ' AdvSendAdvert on;' >> /etc/radvd.conf\n echo ' prefix 2001:DB8::/64 { ' >> /etc/radvd.conf\n echo ' AdvOnLink on; }; ' >> /etc/radvd.conf\n echo ' }; ' >> /etc/radvd.conf\n\n # enable ipv6 forwarding\n sysctl -w net.ipv6.conf.all.forwarding=1\n service radvd restart\n\nelse\n ip link set test1p master testbr\n ip link set test2p master testbr\n # Run joint DHCP4/DHCP6 server with RA enabled in veth namespace\n if systemctl is-active firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if ! 
firewall-cmd --query-service=\"$service\"; then\n firewall-cmd --add-service \"$service\"\n fi\n done\n fi\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --dhcp-range=2001:DB8::10,2001:DB8::1FF,slaac,64,240 --enable-ra --interface=testbr --bind-interfaces\nfi\n", "start": "2024-09-20 21:33:31.752193", "end": "2024-09-20 21:33:33.012443", "delta": "0:00:01.260250", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euxo pipefail\nexec 1>&2\nip link add test1 type veth peer name test1p\nip link add test2 type veth peer name test2p\nif [ -n \"$(pgrep NetworkManager)\" ];then\n nmcli d set test1 managed true\n nmcli d set test2 managed true\n # NetworkManager should not manage DHCP server ports\n nmcli d set test1p managed false\n nmcli d set test2p managed false\nfi\nip link set test1p up\nip link set test2p up\n\n# Create the 'testbr' - providing both 10.x ipv4 and 2620:52:0 ipv6 dhcp\nip link add name testbr type bridge forward_delay 0\nif [ -n \"$(pgrep NetworkManager)\" ];then\n # NetworkManager should not manage DHCP server ports\n nmcli d set testbr managed false\nfi\nip link set testbr up\ntimer=0\n# The while loop following is a workaround for the NM bug, which can be\n# tracked in https://bugzilla.redhat.com/show_bug.cgi?id=2079642\nwhile ! ip addr show testbr | grep -q 'inet [1-9]'\ndo\n let \"timer+=1\"\n if [ $timer -eq 30 ]; then\n echo ERROR - could not add testbr\n ip addr\n exit 1\n fi\n sleep 1\n rc=0\n ip addr add 192.0.2.1/24 dev testbr || rc=\"$?\"\n if [ \"$rc\" != 0 ]; then\n echo NOTICE - could not add testbr - error code \"$rc\"\n continue\n fi\n ip -6 addr add 2001:DB8::1/32 dev testbr || rc=\"$?\"\n if [ \"$rc\" != 0 ]; then\n echo NOTICE - could not add testbr - error code \"$rc\"\n continue\n fi\ndone\n\nif grep 'release 6' /etc/redhat-release; then\n # We need bridge-utils and radvd only in rhel6\n if ! 
rpm -q --quiet radvd; then yum -y install radvd; fi\n if ! rpm -q --quiet bridge-utils; then yum -y install bridge-utils; fi\n\n # We need to add iptables rule to allow dhcp request\n iptables -I INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\n\n # Add test1, test2 peers into the testbr\n brctl addif testbr test1p\n brctl addif testbr test2p\n\n # in RHEL6 /run is not present\n mkdir -p /run\n\n # and dnsmasq does not support ipv6\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --interface=testbr --bind-interfaces\n\n # start radvd for ipv6\n echo 'interface testbr {' > /etc/radvd.conf\n echo ' AdvSendAdvert on;' >> /etc/radvd.conf\n echo ' prefix 2001:DB8::/64 { ' >> /etc/radvd.conf\n echo ' AdvOnLink on; }; ' >> /etc/radvd.conf\n echo ' }; ' >> /etc/radvd.conf\n\n # enable ipv6 forwarding\n sysctl -w net.ipv6.conf.all.forwarding=1\n service radvd restart\n\nelse\n ip link set test1p master testbr\n ip link set test2p master testbr\n # Run joint DHCP4/DHCP6 server with RA enabled in veth namespace\n if systemctl is-active firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if ! 
firewall-cmd --query-service=\"$service\"; then\n firewall-cmd --add-service \"$service\"\n fi\n done\n fi\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --dhcp-range=2001:DB8::10,2001:DB8::1FF,slaac,64,240 --enable-ra --interface=testbr --bind-interfaces\nfi\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.41.238 closed. 
8454 1726882413.03442: done with _execute_module (ansible.legacy.command, {'_raw_params': 'set -euxo pipefail\nexec 1>&2\nip link add test1 type veth peer name test1p\nip link add test2 type veth peer name test2p\nif [ -n "$(pgrep NetworkManager)" ];then\n nmcli d set test1 managed true\n nmcli d set test2 managed true\n # NetworkManager should not manage DHCP server ports\n nmcli d set test1p managed false\n nmcli d set test2p managed false\nfi\nip link set test1p up\nip link set test2p up\n\n# Create the \'testbr\' - providing both 10.x ipv4 and 2620:52:0 ipv6 dhcp\nip link add name testbr type bridge forward_delay 0\nif [ -n "$(pgrep NetworkManager)" ];then\n # NetworkManager should not manage DHCP server ports\n nmcli d set testbr managed false\nfi\nip link set testbr up\ntimer=0\n# The while loop following is a workaround for the NM bug, which can be\n# tracked in https://bugzilla.redhat.com/show_bug.cgi?id=2079642\nwhile ! ip addr show testbr | grep -q \'inet [1-9]\'\ndo\n let "timer+=1"\n if [ $timer -eq 30 ]; then\n echo ERROR - could not add testbr\n ip addr\n exit 1\n fi\n sleep 1\n rc=0\n ip addr add 192.0.2.1/24 dev testbr || rc="$?"\n if [ "$rc" != 0 ]; then\n echo NOTICE - could not add testbr - error code "$rc"\n continue\n fi\n ip -6 addr add 2001:DB8::1/32 dev testbr || rc="$?"\n if [ "$rc" != 0 ]; then\n echo NOTICE - could not add testbr - error code "$rc"\n continue\n fi\ndone\n\nif grep \'release 6\' /etc/redhat-release; then\n # We need bridge-utils and radvd only in rhel6\n if ! rpm -q --quiet radvd; then yum -y install radvd; fi\n if ! 
rpm -q --quiet bridge-utils; then yum -y install bridge-utils; fi\n\n # We need to add iptables rule to allow dhcp request\n iptables -I INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\n\n # Add test1, test2 peers into the testbr\n brctl addif testbr test1p\n brctl addif testbr test2p\n\n # in RHEL6 /run is not present\n mkdir -p /run\n\n # and dnsmasq does not support ipv6\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --interface=testbr --bind-interfaces\n\n # start radvd for ipv6\n echo \'interface testbr {\' > /etc/radvd.conf\n echo \' AdvSendAdvert on;\' >> /etc/radvd.conf\n echo \' prefix 2001:DB8::/64 { \' >> /etc/radvd.conf\n echo \' AdvOnLink on; }; \' >> /etc/radvd.conf\n echo \' }; \' >> /etc/radvd.conf\n\n # enable ipv6 forwarding\n sysctl -w net.ipv6.conf.all.forwarding=1\n service radvd restart\n\nelse\n ip link set test1p master testbr\n ip link set test2p master testbr\n # Run joint DHCP4/DHCP6 server with RA enabled in veth namespace\n if systemctl is-active firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if ! 
firewall-cmd --query-service="$service"; then\n firewall-cmd --add-service "$service"\n fi\n done\n fi\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --dhcp-range=2001:DB8::10,2001:DB8::1FF,slaac,64,240 --enable-ra --interface=testbr --bind-interfaces\nfi\n', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882411.2920744-8811-263424694391983/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 8454 1726882413.03458: _low_level_execute_command(): starting 8454 1726882413.03469: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882411.2920744-8811-263424694391983/ > /dev/null 2>&1 && sleep 0' 8454 1726882413.04095: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 8454 1726882413.04112: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 8454 1726882413.04129: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 8454 1726882413.04153: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 8454 1726882413.04208: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final 
Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8454 1726882413.04279: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 8454 1726882413.04310: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8454 1726882413.04455: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8454 1726882413.06539: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8454 1726882413.06542: stdout chunk (state=3): >>><<< 8454 1726882413.06740: stderr chunk (state=3): >>><<< 8454 1726882413.06743: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 8454 1726882413.06746: handler run complete 8454 1726882413.06748: Evaluated conditional (False): False 8454 1726882413.06750: attempt loop complete, returning result 8454 1726882413.06751: _execute() done 8454 1726882413.06753: dumping result to json 8454 1726882413.06755: done dumping result, returning 8454 1726882413.06757: done running TaskExecutor() for managed_node3/TASK: Create test interfaces [0affe814-3a2d-f59f-16b9-000000000012] 8454 1726882413.06758: sending task result for task 0affe814-3a2d-f59f-16b9-000000000012 8454 1726882413.06835: done sending task result for task 0affe814-3a2d-f59f-16b9-000000000012 8454 1726882413.06841: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "cmd": "set -euxo pipefail\nexec 1>&2\nip link add test1 type veth peer name test1p\nip link add test2 type veth peer name test2p\nif [ -n \"$(pgrep NetworkManager)\" ];then\n nmcli d set test1 managed true\n nmcli d set test2 managed true\n # NetworkManager should not manage DHCP server ports\n nmcli d set test1p managed false\n nmcli d set test2p managed false\nfi\nip link set test1p up\nip link set test2p up\n\n# Create the 'testbr' - providing both 10.x ipv4 and 2620:52:0 ipv6 dhcp\nip link add name testbr type bridge forward_delay 0\nif [ -n \"$(pgrep NetworkManager)\" ];then\n # NetworkManager should not manage DHCP server ports\n nmcli d set testbr managed false\nfi\nip link set testbr up\ntimer=0\n# The while loop following is a workaround for the NM bug, which can be\n# tracked in https://bugzilla.redhat.com/show_bug.cgi?id=2079642\nwhile ! 
ip addr show testbr | grep -q 'inet [1-9]'\ndo\n let \"timer+=1\"\n if [ $timer -eq 30 ]; then\n echo ERROR - could not add testbr\n ip addr\n exit 1\n fi\n sleep 1\n rc=0\n ip addr add 192.0.2.1/24 dev testbr || rc=\"$?\"\n if [ \"$rc\" != 0 ]; then\n echo NOTICE - could not add testbr - error code \"$rc\"\n continue\n fi\n ip -6 addr add 2001:DB8::1/32 dev testbr || rc=\"$?\"\n if [ \"$rc\" != 0 ]; then\n echo NOTICE - could not add testbr - error code \"$rc\"\n continue\n fi\ndone\n\nif grep 'release 6' /etc/redhat-release; then\n # We need bridge-utils and radvd only in rhel6\n if ! rpm -q --quiet radvd; then yum -y install radvd; fi\n if ! rpm -q --quiet bridge-utils; then yum -y install bridge-utils; fi\n\n # We need to add iptables rule to allow dhcp request\n iptables -I INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\n\n # Add test1, test2 peers into the testbr\n brctl addif testbr test1p\n brctl addif testbr test2p\n\n # in RHEL6 /run is not present\n mkdir -p /run\n\n # and dnsmasq does not support ipv6\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --interface=testbr --bind-interfaces\n\n # start radvd for ipv6\n echo 'interface testbr {' > /etc/radvd.conf\n echo ' AdvSendAdvert on;' >> /etc/radvd.conf\n echo ' prefix 2001:DB8::/64 { ' >> /etc/radvd.conf\n echo ' AdvOnLink on; }; ' >> /etc/radvd.conf\n echo ' }; ' >> /etc/radvd.conf\n\n # enable ipv6 forwarding\n sysctl -w net.ipv6.conf.all.forwarding=1\n service radvd restart\n\nelse\n ip link set test1p master testbr\n ip link set test2p master testbr\n # Run joint DHCP4/DHCP6 server with RA enabled in veth namespace\n if systemctl is-active firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if ! 
firewall-cmd --query-service=\"$service\"; then\n firewall-cmd --add-service \"$service\"\n fi\n done\n fi\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --dhcp-range=2001:DB8::10,2001:DB8::1FF,slaac,64,240 --enable-ra --interface=testbr --bind-interfaces\nfi\n", "delta": "0:00:01.260250", "end": "2024-09-20 21:33:33.012443", "rc": 0, "start": "2024-09-20 21:33:31.752193" } STDERR: + exec + ip link add test1 type veth peer name test1p + ip link add test2 type veth peer name test2p ++ pgrep NetworkManager + '[' -n 653 ']' + nmcli d set test1 managed true + nmcli d set test2 managed true + nmcli d set test1p managed false + nmcli d set test2p managed false + ip link set test1p up + ip link set test2p up + ip link add name testbr type bridge forward_delay 0 ++ pgrep NetworkManager + '[' -n 653 ']' + nmcli d set testbr managed false + ip link set testbr up + timer=0 + ip addr show testbr + grep -q 'inet [1-9]' + let timer+=1 + '[' 1 -eq 30 ']' + sleep 1 + rc=0 + ip addr add 192.0.2.1/24 dev testbr + '[' 0 '!=' 0 ']' + ip -6 addr add 2001:DB8::1/32 dev testbr + '[' 0 '!=' 0 ']' + ip addr show testbr + grep -q 'inet [1-9]' + grep 'release 6' /etc/redhat-release + ip link set test1p master testbr + ip link set test2p master testbr + systemctl is-active firewalld inactive + dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --dhcp-range=2001:DB8::10,2001:DB8::1FF,slaac,64,240 --enable-ra --interface=testbr --bind-interfaces 8454 1726882413.06951: no more pending results, returning what we have 8454 1726882413.06955: results queue empty 8454 1726882413.06957: checking for any_errors_fatal 8454 1726882413.06966: done checking for any_errors_fatal 8454 1726882413.06967: checking for max_fail_percentage 8454 1726882413.06969: done checking for max_fail_percentage 8454 1726882413.06970: checking to see if all hosts have failed and the 
running result is not ok 8454 1726882413.06971: done checking to see if all hosts have failed 8454 1726882413.06972: getting the remaining hosts for this loop 8454 1726882413.06974: done getting the remaining hosts for this loop 8454 1726882413.06982: getting the next task for host managed_node3 8454 1726882413.06991: done getting next task for host managed_node3 8454 1726882413.06995: ^ task is: TASK: Include the task 'get_interface_stat.yml' 8454 1726882413.06999: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8454 1726882413.07003: getting variables 8454 1726882413.07005: in VariableManager get_vars() 8454 1726882413.07267: Calling all_inventory to load vars for managed_node3 8454 1726882413.07271: Calling groups_inventory to load vars for managed_node3 8454 1726882413.07274: Calling all_plugins_inventory to load vars for managed_node3 8454 1726882413.07289: Calling all_plugins_play to load vars for managed_node3 8454 1726882413.07292: Calling groups_plugins_inventory to load vars for managed_node3 8454 1726882413.07296: Calling groups_plugins_play to load vars for managed_node3 8454 1726882413.07594: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8454 1726882413.07904: done with get_vars() 8454 1726882413.07917: done getting variables TASK [Include the task 'get_interface_stat.yml'] ******************************* task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:3 Friday 20 September 2024 21:33:33 -0400 (0:00:01.841) 0:00:11.098 ****** 8454 1726882413.08032: entering _queue_task() for managed_node3/include_tasks 8454 1726882413.08312: worker is 1 (out of 1 available) 8454 1726882413.08325: exiting _queue_task() for managed_node3/include_tasks 8454 1726882413.08455: done queuing things up, now waiting for results queue to drain 8454 1726882413.08457: waiting for pending results... 
8454 1726882413.08621: running TaskExecutor() for managed_node3/TASK: Include the task 'get_interface_stat.yml' 8454 1726882413.08781: in run() - task 0affe814-3a2d-f59f-16b9-000000000016 8454 1726882413.08810: variable 'ansible_search_path' from source: unknown 8454 1726882413.08819: variable 'ansible_search_path' from source: unknown 8454 1726882413.08867: calling self._execute() 8454 1726882413.08974: variable 'ansible_host' from source: host vars for 'managed_node3' 8454 1726882413.08998: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8454 1726882413.09021: variable 'omit' from source: magic vars 8454 1726882413.09640: variable 'ansible_distribution_major_version' from source: facts 8454 1726882413.09644: Evaluated conditional (ansible_distribution_major_version != '6'): True 8454 1726882413.09655: _execute() done 8454 1726882413.09658: dumping result to json 8454 1726882413.09661: done dumping result, returning 8454 1726882413.09664: done running TaskExecutor() for managed_node3/TASK: Include the task 'get_interface_stat.yml' [0affe814-3a2d-f59f-16b9-000000000016] 8454 1726882413.09666: sending task result for task 0affe814-3a2d-f59f-16b9-000000000016 8454 1726882413.09925: no more pending results, returning what we have 8454 1726882413.09930: in VariableManager get_vars() 8454 1726882413.09983: Calling all_inventory to load vars for managed_node3 8454 1726882413.09986: Calling groups_inventory to load vars for managed_node3 8454 1726882413.09990: Calling all_plugins_inventory to load vars for managed_node3 8454 1726882413.09996: done sending task result for task 0affe814-3a2d-f59f-16b9-000000000016 8454 1726882413.09999: WORKER PROCESS EXITING 8454 1726882413.10009: Calling all_plugins_play to load vars for managed_node3 8454 1726882413.10013: Calling groups_plugins_inventory to load vars for managed_node3 8454 1726882413.10018: Calling groups_plugins_play to load vars for managed_node3 8454 1726882413.10303: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8454 1726882413.10585: done with get_vars() 8454 1726882413.10593: variable 'ansible_search_path' from source: unknown 8454 1726882413.10595: variable 'ansible_search_path' from source: unknown 8454 1726882413.10652: we have included files to process 8454 1726882413.10653: generating all_blocks data 8454 1726882413.10655: done generating all_blocks data 8454 1726882413.10656: processing included file: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 8454 1726882413.10658: loading included file: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 8454 1726882413.10660: Loading data from /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 8454 1726882413.10957: done processing included file 8454 1726882413.10960: iterating over new_blocks loaded from include file 8454 1726882413.10962: in VariableManager get_vars() 8454 1726882413.10988: done with get_vars() 8454 1726882413.10990: filtering new block on tags 8454 1726882413.11012: done filtering new block on tags 8454 1726882413.11015: done iterating over new_blocks loaded from include file included: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml for managed_node3 8454 1726882413.11020: extending task lists for all hosts with included blocks 8454 1726882413.11169: done extending task lists 8454 1726882413.11175: done processing included files 8454 1726882413.11176: results queue empty 8454 1726882413.11177: checking for any_errors_fatal 8454 1726882413.11186: done checking for any_errors_fatal 8454 1726882413.11187: checking for max_fail_percentage 8454 1726882413.11189: done checking for max_fail_percentage 8454 1726882413.11190: 
checking to see if all hosts have failed and the running result is not ok 8454 1726882413.11191: done checking to see if all hosts have failed 8454 1726882413.11192: getting the remaining hosts for this loop 8454 1726882413.11193: done getting the remaining hosts for this loop 8454 1726882413.11196: getting the next task for host managed_node3 8454 1726882413.11201: done getting next task for host managed_node3 8454 1726882413.11204: ^ task is: TASK: Get stat for interface {{ interface }} 8454 1726882413.11207: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8454 1726882413.11210: getting variables 8454 1726882413.11211: in VariableManager get_vars() 8454 1726882413.11226: Calling all_inventory to load vars for managed_node3 8454 1726882413.11229: Calling groups_inventory to load vars for managed_node3 8454 1726882413.11232: Calling all_plugins_inventory to load vars for managed_node3 8454 1726882413.11241: Calling all_plugins_play to load vars for managed_node3 8454 1726882413.11244: Calling groups_plugins_inventory to load vars for managed_node3 8454 1726882413.11249: Calling groups_plugins_play to load vars for managed_node3 8454 1726882413.11452: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8454 1726882413.11769: done with get_vars() 8454 1726882413.11782: done getting variables 8454 1726882413.11981: variable 'interface' from source: task vars 8454 1726882413.11987: variable 'dhcp_interface1' from source: play vars 8454 1726882413.12071: variable 'dhcp_interface1' from source: play vars TASK [Get stat for interface test1] ******************************************** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml:3 Friday 20 September 2024 21:33:33 -0400 (0:00:00.040) 0:00:11.138 ****** 8454 1726882413.12123: entering _queue_task() for managed_node3/stat 8454 1726882413.12545: worker is 1 (out of 1 available) 8454 1726882413.12556: exiting _queue_task() for managed_node3/stat 8454 1726882413.12568: done queuing things up, now waiting for results queue to drain 8454 1726882413.12570: waiting for pending results... 
8454 1726882413.12696: running TaskExecutor() for managed_node3/TASK: Get stat for interface test1 8454 1726882413.12808: in run() - task 0affe814-3a2d-f59f-16b9-000000000152 8454 1726882413.12820: variable 'ansible_search_path' from source: unknown 8454 1726882413.12823: variable 'ansible_search_path' from source: unknown 8454 1726882413.12863: calling self._execute() 8454 1726882413.12931: variable 'ansible_host' from source: host vars for 'managed_node3' 8454 1726882413.12938: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8454 1726882413.12951: variable 'omit' from source: magic vars 8454 1726882413.13237: variable 'ansible_distribution_major_version' from source: facts 8454 1726882413.13248: Evaluated conditional (ansible_distribution_major_version != '6'): True 8454 1726882413.13254: variable 'omit' from source: magic vars 8454 1726882413.13302: variable 'omit' from source: magic vars 8454 1726882413.13382: variable 'interface' from source: task vars 8454 1726882413.13391: variable 'dhcp_interface1' from source: play vars 8454 1726882413.13443: variable 'dhcp_interface1' from source: play vars 8454 1726882413.13462: variable 'omit' from source: magic vars 8454 1726882413.13503: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 8454 1726882413.13533: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 8454 1726882413.13553: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 8454 1726882413.13571: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8454 1726882413.13585: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8454 1726882413.13614: variable 'inventory_hostname' from source: host vars for 'managed_node3' 8454 
1726882413.13618: variable 'ansible_host' from source: host vars for 'managed_node3' 8454 1726882413.13621: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8454 1726882413.13708: Set connection var ansible_connection to ssh 8454 1726882413.13718: Set connection var ansible_shell_executable to /bin/sh 8454 1726882413.13725: Set connection var ansible_timeout to 10 8454 1726882413.13728: Set connection var ansible_shell_type to sh 8454 1726882413.13737: Set connection var ansible_pipelining to False 8454 1726882413.13744: Set connection var ansible_module_compression to ZIP_DEFLATED 8454 1726882413.13762: variable 'ansible_shell_executable' from source: unknown 8454 1726882413.13765: variable 'ansible_connection' from source: unknown 8454 1726882413.13769: variable 'ansible_module_compression' from source: unknown 8454 1726882413.13773: variable 'ansible_shell_type' from source: unknown 8454 1726882413.13776: variable 'ansible_shell_executable' from source: unknown 8454 1726882413.13779: variable 'ansible_host' from source: host vars for 'managed_node3' 8454 1726882413.13788: variable 'ansible_pipelining' from source: unknown 8454 1726882413.13791: variable 'ansible_timeout' from source: unknown 8454 1726882413.13796: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8454 1726882413.13960: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 8454 1726882413.13969: variable 'omit' from source: magic vars 8454 1726882413.13975: starting attempt loop 8454 1726882413.13978: running the handler 8454 1726882413.13994: _low_level_execute_command(): starting 8454 1726882413.14006: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 8454 1726882413.14550: stderr chunk (state=2): 
>>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 8454 1726882413.14554: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8454 1726882413.14557: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8454 1726882413.14640: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 8454 1726882413.14751: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8454 1726882413.16527: stdout chunk (state=3): >>>/root <<< 8454 1726882413.16649: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8454 1726882413.16684: stderr chunk (state=3): >>><<< 8454 1726882413.16687: stdout chunk (state=3): >>><<< 8454 1726882413.16704: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found 
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 8454 1726882413.16715: _low_level_execute_command(): starting 8454 1726882413.16721: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882413.1670382-8874-230134866087405 `" && echo ansible-tmp-1726882413.1670382-8874-230134866087405="` echo /root/.ansible/tmp/ansible-tmp-1726882413.1670382-8874-230134866087405 `" ) && sleep 0' 8454 1726882413.17151: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 8454 1726882413.17154: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8454 1726882413.17157: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 8454 1726882413.17166: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8454 1726882413.17218: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 8454 1726882413.17224: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8454 1726882413.17341: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8454 1726882413.19398: stdout chunk (state=3): >>>ansible-tmp-1726882413.1670382-8874-230134866087405=/root/.ansible/tmp/ansible-tmp-1726882413.1670382-8874-230134866087405 <<< 8454 1726882413.19521: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8454 1726882413.19563: stderr chunk (state=3): >>><<< 8454 1726882413.19567: stdout chunk (state=3): >>><<< 8454 1726882413.19587: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882413.1670382-8874-230134866087405=/root/.ansible/tmp/ansible-tmp-1726882413.1670382-8874-230134866087405 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 8454 1726882413.19622: variable 'ansible_module_compression' from source: unknown 8454 1726882413.19682: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-8454ynh6eu7y/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 8454 1726882413.19710: variable 'ansible_facts' from source: unknown 8454 1726882413.19790: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882413.1670382-8874-230134866087405/AnsiballZ_stat.py 8454 1726882413.19910: Sending initial data 8454 1726882413.19920: Sent initial data (151 bytes) 8454 1726882413.20525: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying 
existing master <<< 8454 1726882413.20547: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 8454 1726882413.20565: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8454 1726882413.20702: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8454 1726882413.22365: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" <<< 8454 1726882413.22375: stderr chunk (state=3): >>>debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 8454 1726882413.22474: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 8454 1726882413.22586: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-8454ynh6eu7y/tmp0e53k7un /root/.ansible/tmp/ansible-tmp-1726882413.1670382-8874-230134866087405/AnsiballZ_stat.py <<< 8454 1726882413.22590: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882413.1670382-8874-230134866087405/AnsiballZ_stat.py" <<< 8454 1726882413.22696: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-8454ynh6eu7y/tmp0e53k7un" to remote "/root/.ansible/tmp/ansible-tmp-1726882413.1670382-8874-230134866087405/AnsiballZ_stat.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882413.1670382-8874-230134866087405/AnsiballZ_stat.py" <<< 8454 1726882413.23757: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8454 1726882413.23816: stderr chunk (state=3): >>><<< 8454 1726882413.23820: stdout chunk (state=3): >>><<< 8454 1726882413.23935: done transferring module to remote 8454 1726882413.23939: _low_level_execute_command(): starting 8454 1726882413.23942: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882413.1670382-8874-230134866087405/ /root/.ansible/tmp/ansible-tmp-1726882413.1670382-8874-230134866087405/AnsiballZ_stat.py && sleep 0' 8454 1726882413.24498: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: 
Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 8454 1726882413.24519: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8454 1726882413.24666: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8454 1726882413.26596: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8454 1726882413.26603: stdout chunk (state=3): >>><<< 8454 1726882413.26617: stderr chunk (state=3): >>><<< 8454 1726882413.26630: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: 
mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 8454 1726882413.26654: _low_level_execute_command(): starting 8454 1726882413.26659: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882413.1670382-8874-230134866087405/AnsiballZ_stat.py && sleep 0' 8454 1726882413.27068: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 8454 1726882413.27071: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8454 1726882413.27073: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 8454 1726882413.27076: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8454 1726882413.27132: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 8454 1726882413.27141: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 8454 1726882413.27262: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8454 1726882413.44619: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": true, "path": "/sys/class/net/test1", "mode": "0777", "isdir": false, "ischr": false, 
"isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 34738, "dev": 23, "nlink": 1, "atime": 1726882411.7594666, "mtime": 1726882411.7594666, "ctime": 1726882411.7594666, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/test1", "lnk_target": "../../devices/virtual/net/test1", "pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/test1", "follow": false, "checksum_algorithm": "sha1"}}} <<< 8454 1726882413.45987: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.41.238 closed. <<< 8454 1726882413.46047: stderr chunk (state=3): >>><<< 8454 1726882413.46051: stdout chunk (state=3): >>><<< 8454 1726882413.46069: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": true, "path": "/sys/class/net/test1", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 34738, "dev": 23, "nlink": 1, "atime": 1726882411.7594666, "mtime": 1726882411.7594666, "ctime": 1726882411.7594666, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/test1", "lnk_target": "../../devices/virtual/net/test1", "pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": 
false, "path": "/sys/class/net/test1", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.41.238 closed. 
8454 1726882413.46122: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/sys/class/net/test1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882413.1670382-8874-230134866087405/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 8454 1726882413.46131: _low_level_execute_command(): starting 8454 1726882413.46140: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882413.1670382-8874-230134866087405/ > /dev/null 2>&1 && sleep 0' 8454 1726882413.46773: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 8454 1726882413.46781: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8454 1726882413.46925: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8454 1726882413.48890: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8454 1726882413.48939: stderr chunk (state=3): >>><<< 8454 1726882413.48943: stdout chunk (state=3): >>><<< 8454 1726882413.48957: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 8454 1726882413.48963: handler run complete 8454 1726882413.49009: attempt loop complete, returning result 8454 1726882413.49012: _execute() done 8454 1726882413.49015: dumping result to json 8454 1726882413.49024: done dumping result, 
returning 8454 1726882413.49037: done running TaskExecutor() for managed_node3/TASK: Get stat for interface test1 [0affe814-3a2d-f59f-16b9-000000000152] 8454 1726882413.49043: sending task result for task 0affe814-3a2d-f59f-16b9-000000000152 8454 1726882413.49157: done sending task result for task 0affe814-3a2d-f59f-16b9-000000000152 8454 1726882413.49160: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "stat": { "atime": 1726882411.7594666, "block_size": 4096, "blocks": 0, "ctime": 1726882411.7594666, "dev": 23, "device_type": 0, "executable": true, "exists": true, "gid": 0, "gr_name": "root", "inode": 34738, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": true, "isreg": false, "issock": false, "isuid": false, "lnk_source": "/sys/devices/virtual/net/test1", "lnk_target": "../../devices/virtual/net/test1", "mode": "0777", "mtime": 1726882411.7594666, "nlink": 1, "path": "/sys/class/net/test1", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 0, "uid": 0, "wgrp": true, "woth": true, "writeable": true, "wusr": true, "xgrp": true, "xoth": true, "xusr": true } } 8454 1726882413.49287: no more pending results, returning what we have 8454 1726882413.49292: results queue empty 8454 1726882413.49293: checking for any_errors_fatal 8454 1726882413.49295: done checking for any_errors_fatal 8454 1726882413.49296: checking for max_fail_percentage 8454 1726882413.49297: done checking for max_fail_percentage 8454 1726882413.49303: checking to see if all hosts have failed and the running result is not ok 8454 1726882413.49304: done checking to see if all hosts have failed 8454 1726882413.49305: getting the remaining hosts for this loop 8454 1726882413.49307: done getting the remaining hosts for this loop 8454 1726882413.49311: getting the next task for host managed_node3 8454 1726882413.49319: done getting next task for host managed_node3 8454 1726882413.49321: ^ task is: TASK: Assert 
that the interface is present - '{{ interface }}' 8454 1726882413.49324: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8454 1726882413.49328: getting variables 8454 1726882413.49329: in VariableManager get_vars() 8454 1726882413.49457: Calling all_inventory to load vars for managed_node3 8454 1726882413.49461: Calling groups_inventory to load vars for managed_node3 8454 1726882413.49464: Calling all_plugins_inventory to load vars for managed_node3 8454 1726882413.49476: Calling all_plugins_play to load vars for managed_node3 8454 1726882413.49479: Calling groups_plugins_inventory to load vars for managed_node3 8454 1726882413.49484: Calling groups_plugins_play to load vars for managed_node3 8454 1726882413.49709: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8454 1726882413.50007: done with get_vars() 8454 1726882413.50020: done getting variables 8454 1726882413.50141: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) 8454 1726882413.50277: variable 'interface' from source: task vars 8454 1726882413.50281: variable 'dhcp_interface1' from source: play vars 8454 1726882413.50361: variable 'dhcp_interface1' from source: play vars TASK 
[Assert that the interface is present - 'test1'] ************************** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:5 Friday 20 September 2024 21:33:33 -0400 (0:00:00.382) 0:00:11.521 ****** 8454 1726882413.50402: entering _queue_task() for managed_node3/assert 8454 1726882413.50405: Creating lock for assert 8454 1726882413.50671: worker is 1 (out of 1 available) 8454 1726882413.50682: exiting _queue_task() for managed_node3/assert 8454 1726882413.50695: done queuing things up, now waiting for results queue to drain 8454 1726882413.50697: waiting for pending results... 8454 1726882413.51066: running TaskExecutor() for managed_node3/TASK: Assert that the interface is present - 'test1' 8454 1726882413.51103: in run() - task 0affe814-3a2d-f59f-16b9-000000000017 8454 1726882413.51123: variable 'ansible_search_path' from source: unknown 8454 1726882413.51131: variable 'ansible_search_path' from source: unknown 8454 1726882413.51185: calling self._execute() 8454 1726882413.51280: variable 'ansible_host' from source: host vars for 'managed_node3' 8454 1726882413.51295: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8454 1726882413.51310: variable 'omit' from source: magic vars 8454 1726882413.51823: variable 'ansible_distribution_major_version' from source: facts 8454 1726882413.51920: Evaluated conditional (ansible_distribution_major_version != '6'): True 8454 1726882413.51925: variable 'omit' from source: magic vars 8454 1726882413.51929: variable 'omit' from source: magic vars 8454 1726882413.52048: variable 'interface' from source: task vars 8454 1726882413.52063: variable 'dhcp_interface1' from source: play vars 8454 1726882413.52149: variable 'dhcp_interface1' from source: play vars 8454 1726882413.52261: variable 'omit' from source: magic vars 8454 1726882413.52264: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 8454 
1726882413.52276: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 8454 1726882413.52302: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 8454 1726882413.52328: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8454 1726882413.52351: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8454 1726882413.52399: variable 'inventory_hostname' from source: host vars for 'managed_node3' 8454 1726882413.52408: variable 'ansible_host' from source: host vars for 'managed_node3' 8454 1726882413.52416: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8454 1726882413.52549: Set connection var ansible_connection to ssh 8454 1726882413.52568: Set connection var ansible_shell_executable to /bin/sh 8454 1726882413.52579: Set connection var ansible_timeout to 10 8454 1726882413.52599: Set connection var ansible_shell_type to sh 8454 1726882413.52612: Set connection var ansible_pipelining to False 8454 1726882413.52697: Set connection var ansible_module_compression to ZIP_DEFLATED 8454 1726882413.52700: variable 'ansible_shell_executable' from source: unknown 8454 1726882413.52704: variable 'ansible_connection' from source: unknown 8454 1726882413.52706: variable 'ansible_module_compression' from source: unknown 8454 1726882413.52708: variable 'ansible_shell_type' from source: unknown 8454 1726882413.52710: variable 'ansible_shell_executable' from source: unknown 8454 1726882413.52712: variable 'ansible_host' from source: host vars for 'managed_node3' 8454 1726882413.52714: variable 'ansible_pipelining' from source: unknown 8454 1726882413.52716: variable 'ansible_timeout' from source: unknown 8454 1726882413.52718: variable 'ansible_ssh_extra_args' from source: host vars 
for 'managed_node3' 8454 1726882413.52877: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 8454 1726882413.52914: variable 'omit' from source: magic vars 8454 1726882413.52917: starting attempt loop 8454 1726882413.52919: running the handler 8454 1726882413.53093: variable 'interface_stat' from source: set_fact 8454 1726882413.53131: Evaluated conditional (interface_stat.stat.exists): True 8454 1726882413.53158: handler run complete 8454 1726882413.53167: attempt loop complete, returning result 8454 1726882413.53174: _execute() done 8454 1726882413.53182: dumping result to json 8454 1726882413.53241: done dumping result, returning 8454 1726882413.53244: done running TaskExecutor() for managed_node3/TASK: Assert that the interface is present - 'test1' [0affe814-3a2d-f59f-16b9-000000000017] 8454 1726882413.53247: sending task result for task 0affe814-3a2d-f59f-16b9-000000000017 ok: [managed_node3] => { "changed": false } MSG: All assertions passed 8454 1726882413.53568: no more pending results, returning what we have 8454 1726882413.53572: results queue empty 8454 1726882413.53573: checking for any_errors_fatal 8454 1726882413.53580: done checking for any_errors_fatal 8454 1726882413.53582: checking for max_fail_percentage 8454 1726882413.53584: done checking for max_fail_percentage 8454 1726882413.53585: checking to see if all hosts have failed and the running result is not ok 8454 1726882413.53586: done checking to see if all hosts have failed 8454 1726882413.53587: getting the remaining hosts for this loop 8454 1726882413.53588: done getting the remaining hosts for this loop 8454 1726882413.53592: getting the next task for host managed_node3 8454 1726882413.53601: done getting next task for host managed_node3 
8454 1726882413.53604: ^ task is: TASK: Include the task 'get_interface_stat.yml' 8454 1726882413.53607: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8454 1726882413.53610: getting variables 8454 1726882413.53612: in VariableManager get_vars() 8454 1726882413.53649: Calling all_inventory to load vars for managed_node3 8454 1726882413.53652: Calling groups_inventory to load vars for managed_node3 8454 1726882413.53711: Calling all_plugins_inventory to load vars for managed_node3 8454 1726882413.53721: Calling all_plugins_play to load vars for managed_node3 8454 1726882413.53725: Calling groups_plugins_inventory to load vars for managed_node3 8454 1726882413.53729: Calling groups_plugins_play to load vars for managed_node3 8454 1726882413.53746: done sending task result for task 0affe814-3a2d-f59f-16b9-000000000017 8454 1726882413.53749: WORKER PROCESS EXITING 8454 1726882413.53962: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8454 1726882413.54254: done with get_vars() 8454 1726882413.54266: done getting variables TASK [Include the task 'get_interface_stat.yml'] ******************************* task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:3 Friday 20 September 2024 21:33:33 -0400 (0:00:00.039) 0:00:11.561 ****** 8454 1726882413.54372: entering _queue_task() for managed_node3/include_tasks 8454 
1726882413.54726: worker is 1 (out of 1 available) 8454 1726882413.54740: exiting _queue_task() for managed_node3/include_tasks 8454 1726882413.54752: done queuing things up, now waiting for results queue to drain 8454 1726882413.54754: waiting for pending results... 8454 1726882413.54917: running TaskExecutor() for managed_node3/TASK: Include the task 'get_interface_stat.yml' 8454 1726882413.55061: in run() - task 0affe814-3a2d-f59f-16b9-00000000001b 8454 1726882413.55083: variable 'ansible_search_path' from source: unknown 8454 1726882413.55094: variable 'ansible_search_path' from source: unknown 8454 1726882413.55136: calling self._execute() 8454 1726882413.55233: variable 'ansible_host' from source: host vars for 'managed_node3' 8454 1726882413.55249: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8454 1726882413.55272: variable 'omit' from source: magic vars 8454 1726882413.55695: variable 'ansible_distribution_major_version' from source: facts 8454 1726882413.55719: Evaluated conditional (ansible_distribution_major_version != '6'): True 8454 1726882413.55730: _execute() done 8454 1726882413.55745: dumping result to json 8454 1726882413.55756: done dumping result, returning 8454 1726882413.55767: done running TaskExecutor() for managed_node3/TASK: Include the task 'get_interface_stat.yml' [0affe814-3a2d-f59f-16b9-00000000001b] 8454 1726882413.55780: sending task result for task 0affe814-3a2d-f59f-16b9-00000000001b 8454 1726882413.55989: no more pending results, returning what we have 8454 1726882413.55995: in VariableManager get_vars() 8454 1726882413.56055: Calling all_inventory to load vars for managed_node3 8454 1726882413.56059: Calling groups_inventory to load vars for managed_node3 8454 1726882413.56062: Calling all_plugins_inventory to load vars for managed_node3 8454 1726882413.56078: Calling all_plugins_play to load vars for managed_node3 8454 1726882413.56082: Calling groups_plugins_inventory to load vars for 
managed_node3 8454 1726882413.56086: Calling groups_plugins_play to load vars for managed_node3 8454 1726882413.56419: done sending task result for task 0affe814-3a2d-f59f-16b9-00000000001b 8454 1726882413.56423: WORKER PROCESS EXITING 8454 1726882413.56457: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8454 1726882413.56763: done with get_vars() 8454 1726882413.56771: variable 'ansible_search_path' from source: unknown 8454 1726882413.56772: variable 'ansible_search_path' from source: unknown 8454 1726882413.56818: we have included files to process 8454 1726882413.56820: generating all_blocks data 8454 1726882413.56821: done generating all_blocks data 8454 1726882413.56826: processing included file: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 8454 1726882413.56827: loading included file: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 8454 1726882413.56830: Loading data from /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 8454 1726882413.57065: done processing included file 8454 1726882413.57068: iterating over new_blocks loaded from include file 8454 1726882413.57070: in VariableManager get_vars() 8454 1726882413.57093: done with get_vars() 8454 1726882413.57095: filtering new block on tags 8454 1726882413.57122: done filtering new block on tags 8454 1726882413.57125: done iterating over new_blocks loaded from include file included: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml for managed_node3 8454 1726882413.57130: extending task lists for all hosts with included blocks 8454 1726882413.57277: done extending task lists 8454 1726882413.57279: done processing included files 8454 1726882413.57279: results queue 
empty 8454 1726882413.57280: checking for any_errors_fatal 8454 1726882413.57283: done checking for any_errors_fatal 8454 1726882413.57284: checking for max_fail_percentage 8454 1726882413.57286: done checking for max_fail_percentage 8454 1726882413.57287: checking to see if all hosts have failed and the running result is not ok 8454 1726882413.57289: done checking to see if all hosts have failed 8454 1726882413.57289: getting the remaining hosts for this loop 8454 1726882413.57291: done getting the remaining hosts for this loop 8454 1726882413.57294: getting the next task for host managed_node3 8454 1726882413.57299: done getting next task for host managed_node3 8454 1726882413.57301: ^ task is: TASK: Get stat for interface {{ interface }} 8454 1726882413.57305: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8454 1726882413.57308: getting variables 8454 1726882413.57309: in VariableManager get_vars() 8454 1726882413.57329: Calling all_inventory to load vars for managed_node3 8454 1726882413.57332: Calling groups_inventory to load vars for managed_node3 8454 1726882413.57335: Calling all_plugins_inventory to load vars for managed_node3 8454 1726882413.57343: Calling all_plugins_play to load vars for managed_node3 8454 1726882413.57347: Calling groups_plugins_inventory to load vars for managed_node3 8454 1726882413.57350: Calling groups_plugins_play to load vars for managed_node3 8454 1726882413.57542: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8454 1726882413.57822: done with get_vars() 8454 1726882413.57833: done getting variables 8454 1726882413.58018: variable 'interface' from source: task vars 8454 1726882413.58023: variable 'dhcp_interface2' from source: play vars 8454 1726882413.58102: variable 'dhcp_interface2' from source: play vars TASK [Get stat for interface test2] ******************************************** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml:3 Friday 20 September 2024 21:33:33 -0400 (0:00:00.037) 0:00:11.599 ****** 8454 1726882413.58142: entering _queue_task() for managed_node3/stat 8454 1726882413.58399: worker is 1 (out of 1 available) 8454 1726882413.58412: exiting _queue_task() for managed_node3/stat 8454 1726882413.58542: done queuing things up, now waiting for results queue to drain 8454 1726882413.58544: waiting for pending results... 
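Editor's note: the task queued here comes from `tasks/get_interface_stat.yml`. Based on the `module_args` visible later in this log (`path: /sys/class/net/test2`, `get_attributes: false`, `get_checksum: false`, `get_mime: false`) and the fact that `interface_stat` is later reported as coming from `set_fact`, the task presumably resembles the following sketch; it is a reconstruction from the log, not the verbatim file.

```yaml
# Hypothetical reconstruction of tasks/get_interface_stat.yml, inferred from
# the module_args shown in this log. The real file may register into an
# intermediate variable and set interface_stat via set_fact afterwards.
- name: Get stat for interface {{ interface }}
  ansible.builtin.stat:
    path: /sys/class/net/{{ interface }}
    get_attributes: false
    get_checksum: false
    get_mime: false
  register: interface_stat
```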
8454 1726882413.58716: running TaskExecutor() for managed_node3/TASK: Get stat for interface test2 8454 1726882413.58861: in run() - task 0affe814-3a2d-f59f-16b9-00000000016a 8454 1726882413.58889: variable 'ansible_search_path' from source: unknown 8454 1726882413.58898: variable 'ansible_search_path' from source: unknown 8454 1726882413.58944: calling self._execute() 8454 1726882413.59042: variable 'ansible_host' from source: host vars for 'managed_node3' 8454 1726882413.59078: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8454 1726882413.59082: variable 'omit' from source: magic vars 8454 1726882413.59481: variable 'ansible_distribution_major_version' from source: facts 8454 1726882413.59500: Evaluated conditional (ansible_distribution_major_version != '6'): True 8454 1726882413.59623: variable 'omit' from source: magic vars 8454 1726882413.59627: variable 'omit' from source: magic vars 8454 1726882413.59715: variable 'interface' from source: task vars 8454 1726882413.59731: variable 'dhcp_interface2' from source: play vars 8454 1726882413.59814: variable 'dhcp_interface2' from source: play vars 8454 1726882413.59847: variable 'omit' from source: magic vars 8454 1726882413.59898: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 8454 1726882413.59946: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 8454 1726882413.59979: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 8454 1726882413.60328: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8454 1726882413.60348: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8454 1726882413.60400: variable 'inventory_hostname' from source: host vars for 'managed_node3' 8454 
1726882413.60438: variable 'ansible_host' from source: host vars for 'managed_node3' 8454 1726882413.60441: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8454 1726882413.60551: Set connection var ansible_connection to ssh 8454 1726882413.60568: Set connection var ansible_shell_executable to /bin/sh 8454 1726882413.60580: Set connection var ansible_timeout to 10 8454 1726882413.60590: Set connection var ansible_shell_type to sh 8454 1726882413.60615: Set connection var ansible_pipelining to False 8454 1726882413.60638: Set connection var ansible_module_compression to ZIP_DEFLATED 8454 1726882413.60658: variable 'ansible_shell_executable' from source: unknown 8454 1726882413.60715: variable 'ansible_connection' from source: unknown 8454 1726882413.60720: variable 'ansible_module_compression' from source: unknown 8454 1726882413.60722: variable 'ansible_shell_type' from source: unknown 8454 1726882413.60725: variable 'ansible_shell_executable' from source: unknown 8454 1726882413.60727: variable 'ansible_host' from source: host vars for 'managed_node3' 8454 1726882413.60729: variable 'ansible_pipelining' from source: unknown 8454 1726882413.60732: variable 'ansible_timeout' from source: unknown 8454 1726882413.60735: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8454 1726882413.61042: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 8454 1726882413.61047: variable 'omit' from source: magic vars 8454 1726882413.61050: starting attempt loop 8454 1726882413.61052: running the handler 8454 1726882413.61054: _low_level_execute_command(): starting 8454 1726882413.61056: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 8454 1726882413.62483: stderr chunk (state=2): 
>>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 8454 1726882413.62663: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 8454 1726882413.62667: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8454 1726882413.62800: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 8454 1726882413.62877: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8454 1726882413.64730: stdout chunk (state=3): >>>/root <<< 8454 1726882413.64918: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8454 1726882413.64921: stdout chunk (state=3): >>><<< 8454 1726882413.64924: stderr chunk (state=3): >>><<< 8454 1726882413.65142: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found 
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 8454 1726882413.65146: _low_level_execute_command(): starting 8454 1726882413.65149: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882413.6498342-8891-17640432987042 `" && echo ansible-tmp-1726882413.6498342-8891-17640432987042="` echo /root/.ansible/tmp/ansible-tmp-1726882413.6498342-8891-17640432987042 `" ) && sleep 0' 8454 1726882413.66440: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 8454 1726882413.66444: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 8454 1726882413.66447: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 8454 1726882413.66451: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 8454 1726882413.66454: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 <<< 8454 1726882413.66463: stderr chunk (state=3): >>>debug2: match not found <<< 8454 1726882413.66465: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 8454 1726882413.66541: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 8454 1726882413.66544: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.41.238 is address <<< 8454 1726882413.66547: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 8454 1726882413.66739: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 8454 1726882413.66750: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8454 1726882413.66781: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8454 1726882413.68869: stdout chunk (state=3): >>>ansible-tmp-1726882413.6498342-8891-17640432987042=/root/.ansible/tmp/ansible-tmp-1726882413.6498342-8891-17640432987042 <<< 8454 1726882413.69053: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8454 1726882413.69074: stderr chunk (state=3): >>><<< 8454 1726882413.69086: stdout chunk (state=3): >>><<< 8454 1726882413.69114: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882413.6498342-8891-17640432987042=/root/.ansible/tmp/ansible-tmp-1726882413.6498342-8891-17640432987042 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration 
data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 8454 1726882413.69176: variable 'ansible_module_compression' from source: unknown 8454 1726882413.69244: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-8454ynh6eu7y/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 8454 1726882413.69292: variable 'ansible_facts' from source: unknown 8454 1726882413.69399: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882413.6498342-8891-17640432987042/AnsiballZ_stat.py 8454 1726882413.69630: Sending initial data 8454 1726882413.69644: Sent initial data (150 bytes) 8454 1726882413.70214: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master <<< 8454 1726882413.70230: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 8454 1726882413.70245: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8454 1726882413.70386: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8454 1726882413.72068: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 8454 1726882413.72087: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 <<< 8454 1726882413.72119: stderr chunk (state=3): >>>debug2: Server supports extension "statvfs@openssh.com" revision 2 <<< 8454 1726882413.72140: stderr chunk (state=3): >>>debug2: Server supports extension "fstatvfs@openssh.com" revision 2 <<< 8454 1726882413.72165: stderr chunk (state=3): >>>debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 8454 1726882413.72278: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 8454 1726882413.72412: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-8454ynh6eu7y/tmpsx7cdqt_ /root/.ansible/tmp/ansible-tmp-1726882413.6498342-8891-17640432987042/AnsiballZ_stat.py <<< 8454 1726882413.72417: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882413.6498342-8891-17640432987042/AnsiballZ_stat.py" <<< 8454 1726882413.72519: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-8454ynh6eu7y/tmpsx7cdqt_" to remote "/root/.ansible/tmp/ansible-tmp-1726882413.6498342-8891-17640432987042/AnsiballZ_stat.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882413.6498342-8891-17640432987042/AnsiballZ_stat.py" <<< 8454 1726882413.74224: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8454 1726882413.74251: stderr chunk (state=3): >>><<< 8454 1726882413.74254: stdout chunk (state=3): >>><<< 8454 1726882413.74377: done transferring module to remote 8454 1726882413.74384: _low_level_execute_command(): starting 8454 1726882413.74387: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882413.6498342-8891-17640432987042/ /root/.ansible/tmp/ansible-tmp-1726882413.6498342-8891-17640432987042/AnsiballZ_stat.py && sleep 0' 8454 1726882413.74956: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 8454 1726882413.74974: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 8454 1726882413.74994: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 8454 1726882413.75051: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8454 1726882413.75132: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 8454 1726882413.75153: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 8454 1726882413.75173: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8454 1726882413.75331: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8454 1726882413.77347: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8454 1726882413.77353: stdout chunk (state=3): >>><<< 8454 1726882413.77361: stderr chunk (state=3): >>><<< 8454 1726882413.77385: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 
'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 8454 1726882413.77388: _low_level_execute_command(): starting 8454 1726882413.77395: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882413.6498342-8891-17640432987042/AnsiballZ_stat.py && sleep 0' 8454 1726882413.78074: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 8454 1726882413.78084: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 8454 1726882413.78096: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 8454 1726882413.78216: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master <<< 8454 1726882413.78262: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 8454 1726882413.78389: stderr chunk (state=3): 
>>>debug1: mux_client_request_session: master session id: 2 <<< 8454 1726882413.95670: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": true, "path": "/sys/class/net/test2", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 35144, "dev": 23, "nlink": 1, "atime": 1726882411.7651284, "mtime": 1726882411.7651284, "ctime": 1726882411.7651284, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/test2", "lnk_target": "../../devices/virtual/net/test2", "pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/test2", "follow": false, "checksum_algorithm": "sha1"}}} <<< 8454 1726882413.97413: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.41.238 closed. 
<<< 8454 1726882413.97418: stdout chunk (state=3): >>><<< 8454 1726882413.97421: stderr chunk (state=3): >>><<< 8454 1726882413.97424: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": true, "path": "/sys/class/net/test2", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 35144, "dev": 23, "nlink": 1, "atime": 1726882411.7651284, "mtime": 1726882411.7651284, "ctime": 1726882411.7651284, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/test2", "lnk_target": "../../devices/virtual/net/test2", "pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/test2", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration 
data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.41.238 closed. 8454 1726882413.97428: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/sys/class/net/test2', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882413.6498342-8891-17640432987042/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 8454 1726882413.97432: _low_level_execute_command(): starting 8454 1726882413.97442: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882413.6498342-8891-17640432987042/ > /dev/null 2>&1 && sleep 0' 8454 1726882413.97991: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 8454 1726882413.98000: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 8454 1726882413.98015: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 8454 1726882413.98033: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 8454 1726882413.98049: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 <<< 8454 1726882413.98151: stderr chunk 
(state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 8454 1726882413.98169: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8454 1726882413.98316: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8454 1726882414.00429: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8454 1726882414.00433: stdout chunk (state=3): >>><<< 8454 1726882414.00437: stderr chunk (state=3): >>><<< 8454 1726882414.00647: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 
originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 8454 1726882414.00651: handler run complete 8454 1726882414.00653: attempt loop complete, returning result 8454 1726882414.00656: _execute() done 8454 1726882414.00658: dumping result to json 8454 1726882414.00661: done dumping result, returning 8454 1726882414.00663: done running TaskExecutor() for managed_node3/TASK: Get stat for interface test2 [0affe814-3a2d-f59f-16b9-00000000016a] 8454 1726882414.00666: sending task result for task 0affe814-3a2d-f59f-16b9-00000000016a 8454 1726882414.00755: done sending task result for task 0affe814-3a2d-f59f-16b9-00000000016a 8454 1726882414.00759: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "stat": { "atime": 1726882411.7651284, "block_size": 4096, "blocks": 0, "ctime": 1726882411.7651284, "dev": 23, "device_type": 0, "executable": true, "exists": true, "gid": 0, "gr_name": "root", "inode": 35144, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": true, "isreg": false, "issock": false, "isuid": false, "lnk_source": "/sys/devices/virtual/net/test2", "lnk_target": "../../devices/virtual/net/test2", "mode": "0777", "mtime": 1726882411.7651284, "nlink": 1, "path": "/sys/class/net/test2", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 0, "uid": 0, "wgrp": true, "woth": true, "writeable": true, "wusr": true, "xgrp": true, "xoth": true, "xusr": true } } 8454 1726882414.00890: no more pending results, returning what we have 8454 1726882414.00895: results queue empty 8454 1726882414.00896: checking for any_errors_fatal 8454 1726882414.00898: done checking for any_errors_fatal 8454 
1726882414.00899: checking for max_fail_percentage 8454 1726882414.00901: done checking for max_fail_percentage 8454 1726882414.00903: checking to see if all hosts have failed and the running result is not ok 8454 1726882414.00904: done checking to see if all hosts have failed 8454 1726882414.00905: getting the remaining hosts for this loop 8454 1726882414.00907: done getting the remaining hosts for this loop 8454 1726882414.00911: getting the next task for host managed_node3 8454 1726882414.00922: done getting next task for host managed_node3 8454 1726882414.00926: ^ task is: TASK: Assert that the interface is present - '{{ interface }}' 8454 1726882414.00929: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8454 1726882414.01054: getting variables 8454 1726882414.01057: in VariableManager get_vars() 8454 1726882414.01106: Calling all_inventory to load vars for managed_node3 8454 1726882414.01110: Calling groups_inventory to load vars for managed_node3 8454 1726882414.01113: Calling all_plugins_inventory to load vars for managed_node3 8454 1726882414.01127: Calling all_plugins_play to load vars for managed_node3 8454 1726882414.01131: Calling groups_plugins_inventory to load vars for managed_node3 8454 1726882414.01138: Calling groups_plugins_play to load vars for managed_node3 8454 1726882414.01969: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8454 1726882414.02261: done with get_vars() 8454 1726882414.02278: done getting variables 8454 1726882414.02354: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 8454 1726882414.02497: variable 'interface' from source: task vars 8454 1726882414.02506: variable 'dhcp_interface2' from source: play vars 8454 1726882414.02582: variable 'dhcp_interface2' from source: play vars TASK [Assert that the interface is present - 'test2'] ************************** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:5 Friday 20 September 2024 21:33:34 -0400 (0:00:00.444) 0:00:12.043 ****** 8454 1726882414.02628: entering _queue_task() for managed_node3/assert 8454 1726882414.03351: worker is 1 (out of 1 available) 8454 1726882414.03362: exiting _queue_task() for managed_node3/assert 8454 1726882414.03374: done queuing things up, now waiting for results queue to drain 8454 1726882414.03375: waiting for pending results... 
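Editor's note: from the task name above and the conditional evaluated just below (`interface_stat.stat.exists`), the assert task in `tasks/assert_device_present.yml` presumably resembles this sketch; the `fail_msg` wording is an assumption, not taken from the log.

```yaml
# Hypothetical sketch of tasks/assert_device_present.yml, inferred from the
# task name and the 'Evaluated conditional (interface_stat.stat.exists)'
# line in this log -- not the verbatim file. fail_msg is illustrative only.
- name: Assert that the interface is present - '{{ interface }}'
  ansible.builtin.assert:
    that:
      - interface_stat.stat.exists
    fail_msg: "Interface {{ interface }} not found under /sys/class/net"
```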
8454 1726882414.03618: running TaskExecutor() for managed_node3/TASK: Assert that the interface is present - 'test2' 8454 1726882414.03623: in run() - task 0affe814-3a2d-f59f-16b9-00000000001c 8454 1726882414.03742: variable 'ansible_search_path' from source: unknown 8454 1726882414.03746: variable 'ansible_search_path' from source: unknown 8454 1726882414.03748: calling self._execute() 8454 1726882414.03783: variable 'ansible_host' from source: host vars for 'managed_node3' 8454 1726882414.03796: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8454 1726882414.03815: variable 'omit' from source: magic vars 8454 1726882414.04227: variable 'ansible_distribution_major_version' from source: facts 8454 1726882414.04248: Evaluated conditional (ansible_distribution_major_version != '6'): True 8454 1726882414.04261: variable 'omit' from source: magic vars 8454 1726882414.04326: variable 'omit' from source: magic vars 8454 1726882414.04456: variable 'interface' from source: task vars 8454 1726882414.04467: variable 'dhcp_interface2' from source: play vars 8454 1726882414.04555: variable 'dhcp_interface2' from source: play vars 8454 1726882414.04581: variable 'omit' from source: magic vars 8454 1726882414.04637: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 8454 1726882414.04683: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 8454 1726882414.04712: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 8454 1726882414.04744: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8454 1726882414.04762: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8454 1726882414.04799: variable 'inventory_hostname' from source: host vars for 
'managed_node3' 8454 1726882414.04808: variable 'ansible_host' from source: host vars for 'managed_node3' 8454 1726882414.04817: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8454 1726882414.04949: Set connection var ansible_connection to ssh 8454 1726882414.04966: Set connection var ansible_shell_executable to /bin/sh 8454 1726882414.04978: Set connection var ansible_timeout to 10 8454 1726882414.04985: Set connection var ansible_shell_type to sh 8454 1726882414.04999: Set connection var ansible_pipelining to False 8454 1726882414.05040: Set connection var ansible_module_compression to ZIP_DEFLATED 8454 1726882414.05044: variable 'ansible_shell_executable' from source: unknown 8454 1726882414.05051: variable 'ansible_connection' from source: unknown 8454 1726882414.05059: variable 'ansible_module_compression' from source: unknown 8454 1726882414.05065: variable 'ansible_shell_type' from source: unknown 8454 1726882414.05072: variable 'ansible_shell_executable' from source: unknown 8454 1726882414.05080: variable 'ansible_host' from source: host vars for 'managed_node3' 8454 1726882414.05089: variable 'ansible_pipelining' from source: unknown 8454 1726882414.05152: variable 'ansible_timeout' from source: unknown 8454 1726882414.05156: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8454 1726882414.05276: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 8454 1726882414.05296: variable 'omit' from source: magic vars 8454 1726882414.05307: starting attempt loop 8454 1726882414.05315: running the handler 8454 1726882414.05486: variable 'interface_stat' from source: set_fact 8454 1726882414.05516: Evaluated conditional (interface_stat.stat.exists): True 8454 
1726882414.05526: handler run complete 8454 1726882414.05551: attempt loop complete, returning result 8454 1726882414.05558: _execute() done 8454 1726882414.05586: dumping result to json 8454 1726882414.05589: done dumping result, returning 8454 1726882414.05592: done running TaskExecutor() for managed_node3/TASK: Assert that the interface is present - 'test2' [0affe814-3a2d-f59f-16b9-00000000001c] 8454 1726882414.05597: sending task result for task 0affe814-3a2d-f59f-16b9-00000000001c ok: [managed_node3] => { "changed": false } MSG: All assertions passed 8454 1726882414.05748: no more pending results, returning what we have 8454 1726882414.05752: results queue empty 8454 1726882414.05753: checking for any_errors_fatal 8454 1726882414.05762: done checking for any_errors_fatal 8454 1726882414.05763: checking for max_fail_percentage 8454 1726882414.05765: done checking for max_fail_percentage 8454 1726882414.05766: checking to see if all hosts have failed and the running result is not ok 8454 1726882414.05767: done checking to see if all hosts have failed 8454 1726882414.05768: getting the remaining hosts for this loop 8454 1726882414.05770: done getting the remaining hosts for this loop 8454 1726882414.05774: getting the next task for host managed_node3 8454 1726882414.05784: done getting next task for host managed_node3 8454 1726882414.05787: ^ task is: TASK: Backup the /etc/resolv.conf for initscript 8454 1726882414.05790: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8454 1726882414.05793: getting variables 8454 1726882414.05795: in VariableManager get_vars() 8454 1726882414.05840: Calling all_inventory to load vars for managed_node3 8454 1726882414.05844: Calling groups_inventory to load vars for managed_node3 8454 1726882414.05847: Calling all_plugins_inventory to load vars for managed_node3 8454 1726882414.05858: Calling all_plugins_play to load vars for managed_node3 8454 1726882414.05861: Calling groups_plugins_inventory to load vars for managed_node3 8454 1726882414.05865: Calling groups_plugins_play to load vars for managed_node3 8454 1726882414.06067: done sending task result for task 0affe814-3a2d-f59f-16b9-00000000001c 8454 1726882414.06070: WORKER PROCESS EXITING 8454 1726882414.06097: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8454 1726882414.06427: done with get_vars() 8454 1726882414.06440: done getting variables 8454 1726882414.06507: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Backup the /etc/resolv.conf for initscript] ****************************** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond.yml:28 Friday 20 September 2024 21:33:34 -0400 (0:00:00.039) 0:00:12.083 ****** 8454 1726882414.06539: entering _queue_task() for managed_node3/command 8454 1726882414.06788: worker is 1 (out of 1 available) 8454 1726882414.06801: exiting _queue_task() for managed_node3/command 8454 1726882414.06816: done queuing things up, now waiting for results queue to drain 8454 1726882414.06818: waiting for pending results... 
8454 1726882414.07079: running TaskExecutor() for managed_node3/TASK: Backup the /etc/resolv.conf for initscript 8454 1726882414.07192: in run() - task 0affe814-3a2d-f59f-16b9-00000000001d 8454 1726882414.07211: variable 'ansible_search_path' from source: unknown 8454 1726882414.07267: calling self._execute() 8454 1726882414.07441: variable 'ansible_host' from source: host vars for 'managed_node3' 8454 1726882414.07445: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8454 1726882414.07448: variable 'omit' from source: magic vars 8454 1726882414.07824: variable 'ansible_distribution_major_version' from source: facts 8454 1726882414.07845: Evaluated conditional (ansible_distribution_major_version != '6'): True 8454 1726882414.08002: variable 'network_provider' from source: set_fact 8454 1726882414.08014: Evaluated conditional (network_provider == "initscripts"): False 8454 1726882414.08031: when evaluation is False, skipping this task 8454 1726882414.08041: _execute() done 8454 1726882414.08050: dumping result to json 8454 1726882414.08058: done dumping result, returning 8454 1726882414.08068: done running TaskExecutor() for managed_node3/TASK: Backup the /etc/resolv.conf for initscript [0affe814-3a2d-f59f-16b9-00000000001d] 8454 1726882414.08078: sending task result for task 0affe814-3a2d-f59f-16b9-00000000001d skipping: [managed_node3] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 8454 1726882414.08341: no more pending results, returning what we have 8454 1726882414.08346: results queue empty 8454 1726882414.08348: checking for any_errors_fatal 8454 1726882414.08354: done checking for any_errors_fatal 8454 1726882414.08355: checking for max_fail_percentage 8454 1726882414.08357: done checking for max_fail_percentage 8454 1726882414.08358: checking to see if all hosts have failed and the running result is not ok 8454 1726882414.08359: done checking to see 
if all hosts have failed 8454 1726882414.08360: getting the remaining hosts for this loop 8454 1726882414.08362: done getting the remaining hosts for this loop 8454 1726882414.08366: getting the next task for host managed_node3 8454 1726882414.08372: done getting next task for host managed_node3 8454 1726882414.08375: ^ task is: TASK: TEST Add Bond with 2 ports 8454 1726882414.08378: ^ state is: HOST STATE: block=2, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8454 1726882414.08381: getting variables 8454 1726882414.08382: in VariableManager get_vars() 8454 1726882414.08425: Calling all_inventory to load vars for managed_node3 8454 1726882414.08428: Calling groups_inventory to load vars for managed_node3 8454 1726882414.08431: Calling all_plugins_inventory to load vars for managed_node3 8454 1726882414.08446: Calling all_plugins_play to load vars for managed_node3 8454 1726882414.08450: Calling groups_plugins_inventory to load vars for managed_node3 8454 1726882414.08454: Calling groups_plugins_play to load vars for managed_node3 8454 1726882414.08761: done sending task result for task 0affe814-3a2d-f59f-16b9-00000000001d 8454 1726882414.08764: WORKER PROCESS EXITING 8454 1726882414.08791: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8454 1726882414.09044: done with get_vars() 8454 1726882414.09056: done getting variables 8454 1726882414.09128: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [TEST Add Bond with 2 ports] 
********************************************** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond.yml:33 Friday 20 September 2024 21:33:34 -0400 (0:00:00.026) 0:00:12.109 ****** 8454 1726882414.09163: entering _queue_task() for managed_node3/debug 8454 1726882414.09541: worker is 1 (out of 1 available) 8454 1726882414.09553: exiting _queue_task() for managed_node3/debug 8454 1726882414.09566: done queuing things up, now waiting for results queue to drain 8454 1726882414.09567: waiting for pending results... 8454 1726882414.09759: running TaskExecutor() for managed_node3/TASK: TEST Add Bond with 2 ports 8454 1726882414.09875: in run() - task 0affe814-3a2d-f59f-16b9-00000000001e 8454 1726882414.09899: variable 'ansible_search_path' from source: unknown 8454 1726882414.09949: calling self._execute() 8454 1726882414.10056: variable 'ansible_host' from source: host vars for 'managed_node3' 8454 1726882414.10076: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8454 1726882414.10095: variable 'omit' from source: magic vars 8454 1726882414.10542: variable 'ansible_distribution_major_version' from source: facts 8454 1726882414.10616: Evaluated conditional (ansible_distribution_major_version != '6'): True 8454 1726882414.10619: variable 'omit' from source: magic vars 8454 1726882414.10622: variable 'omit' from source: magic vars 8454 1726882414.10641: variable 'omit' from source: magic vars 8454 1726882414.10692: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 8454 1726882414.10740: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 8454 1726882414.10769: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 8454 1726882414.10801: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, 
class_only=False) 8454 1726882414.10820: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8454 1726882414.10886: variable 'inventory_hostname' from source: host vars for 'managed_node3' 8454 1726882414.10890: variable 'ansible_host' from source: host vars for 'managed_node3' 8454 1726882414.10893: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8454 1726882414.11018: Set connection var ansible_connection to ssh 8454 1726882414.11051: Set connection var ansible_shell_executable to /bin/sh 8454 1726882414.11054: Set connection var ansible_timeout to 10 8454 1726882414.11058: Set connection var ansible_shell_type to sh 8454 1726882414.11161: Set connection var ansible_pipelining to False 8454 1726882414.11165: Set connection var ansible_module_compression to ZIP_DEFLATED 8454 1726882414.11167: variable 'ansible_shell_executable' from source: unknown 8454 1726882414.11170: variable 'ansible_connection' from source: unknown 8454 1726882414.11172: variable 'ansible_module_compression' from source: unknown 8454 1726882414.11174: variable 'ansible_shell_type' from source: unknown 8454 1726882414.11177: variable 'ansible_shell_executable' from source: unknown 8454 1726882414.11179: variable 'ansible_host' from source: host vars for 'managed_node3' 8454 1726882414.11181: variable 'ansible_pipelining' from source: unknown 8454 1726882414.11183: variable 'ansible_timeout' from source: unknown 8454 1726882414.11185: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8454 1726882414.11336: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 8454 1726882414.11358: variable 'omit' from source: 
magic vars 8454 1726882414.11369: starting attempt loop 8454 1726882414.11408: running the handler 8454 1726882414.11444: handler run complete 8454 1726882414.11470: attempt loop complete, returning result 8454 1726882414.11477: _execute() done 8454 1726882414.11490: dumping result to json 8454 1726882414.11516: done dumping result, returning 8454 1726882414.11519: done running TaskExecutor() for managed_node3/TASK: TEST Add Bond with 2 ports [0affe814-3a2d-f59f-16b9-00000000001e] 8454 1726882414.11522: sending task result for task 0affe814-3a2d-f59f-16b9-00000000001e 8454 1726882414.11740: done sending task result for task 0affe814-3a2d-f59f-16b9-00000000001e 8454 1726882414.11744: WORKER PROCESS EXITING ok: [managed_node3] => {} MSG: ################################################## 8454 1726882414.11794: no more pending results, returning what we have 8454 1726882414.11797: results queue empty 8454 1726882414.11798: checking for any_errors_fatal 8454 1726882414.11805: done checking for any_errors_fatal 8454 1726882414.11806: checking for max_fail_percentage 8454 1726882414.11808: done checking for max_fail_percentage 8454 1726882414.11814: checking to see if all hosts have failed and the running result is not ok 8454 1726882414.11815: done checking to see if all hosts have failed 8454 1726882414.11816: getting the remaining hosts for this loop 8454 1726882414.11818: done getting the remaining hosts for this loop 8454 1726882414.11822: getting the next task for host managed_node3 8454 1726882414.11829: done getting next task for host managed_node3 8454 1726882414.11837: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 8454 1726882414.11840: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8454 1726882414.11858: getting variables 8454 1726882414.11860: in VariableManager get_vars() 8454 1726882414.11902: Calling all_inventory to load vars for managed_node3 8454 1726882414.11905: Calling groups_inventory to load vars for managed_node3 8454 1726882414.11908: Calling all_plugins_inventory to load vars for managed_node3 8454 1726882414.12022: Calling all_plugins_play to load vars for managed_node3 8454 1726882414.12032: Calling groups_plugins_inventory to load vars for managed_node3 8454 1726882414.12037: Calling groups_plugins_play to load vars for managed_node3 8454 1726882414.12305: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8454 1726882414.12612: done with get_vars() 8454 1726882414.12622: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] *** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4 Friday 20 September 2024 21:33:34 -0400 (0:00:00.035) 0:00:12.145 ****** 8454 1726882414.12741: entering _queue_task() for managed_node3/include_tasks 8454 1726882414.12979: worker is 1 (out of 1 available) 8454 1726882414.12993: exiting _queue_task() for managed_node3/include_tasks 8454 1726882414.13005: done queuing things up, now waiting for results queue to drain 8454 1726882414.13122: waiting for pending results... 
8454 1726882414.13354: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 8454 1726882414.13433: in run() - task 0affe814-3a2d-f59f-16b9-000000000026 8454 1726882414.13460: variable 'ansible_search_path' from source: unknown 8454 1726882414.13471: variable 'ansible_search_path' from source: unknown 8454 1726882414.13512: calling self._execute() 8454 1726882414.13609: variable 'ansible_host' from source: host vars for 'managed_node3' 8454 1726882414.13621: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8454 1726882414.13669: variable 'omit' from source: magic vars 8454 1726882414.14065: variable 'ansible_distribution_major_version' from source: facts 8454 1726882414.14083: Evaluated conditional (ansible_distribution_major_version != '6'): True 8454 1726882414.14095: _execute() done 8454 1726882414.14108: dumping result to json 8454 1726882414.14120: done dumping result, returning 8454 1726882414.14214: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [0affe814-3a2d-f59f-16b9-000000000026] 8454 1726882414.14217: sending task result for task 0affe814-3a2d-f59f-16b9-000000000026 8454 1726882414.14287: done sending task result for task 0affe814-3a2d-f59f-16b9-000000000026 8454 1726882414.14290: WORKER PROCESS EXITING 8454 1726882414.14339: no more pending results, returning what we have 8454 1726882414.14345: in VariableManager get_vars() 8454 1726882414.14395: Calling all_inventory to load vars for managed_node3 8454 1726882414.14398: Calling groups_inventory to load vars for managed_node3 8454 1726882414.14401: Calling all_plugins_inventory to load vars for managed_node3 8454 1726882414.14416: Calling all_plugins_play to load vars for managed_node3 8454 1726882414.14419: Calling groups_plugins_inventory to load vars for managed_node3 8454 1726882414.14536: Calling groups_plugins_play to load vars for 
managed_node3 8454 1726882414.14807: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8454 1726882414.15102: done with get_vars() 8454 1726882414.15116: variable 'ansible_search_path' from source: unknown 8454 1726882414.15118: variable 'ansible_search_path' from source: unknown 8454 1726882414.15165: we have included files to process 8454 1726882414.15167: generating all_blocks data 8454 1726882414.15169: done generating all_blocks data 8454 1726882414.15174: processing included file: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 8454 1726882414.15175: loading included file: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 8454 1726882414.15178: Loading data from /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 8454 1726882414.16115: done processing included file 8454 1726882414.16117: iterating over new_blocks loaded from include file 8454 1726882414.16119: in VariableManager get_vars() 8454 1726882414.16152: done with get_vars() 8454 1726882414.16155: filtering new block on tags 8454 1726882414.16176: done filtering new block on tags 8454 1726882414.16179: in VariableManager get_vars() 8454 1726882414.16215: done with get_vars() 8454 1726882414.16217: filtering new block on tags 8454 1726882414.16246: done filtering new block on tags 8454 1726882414.16250: in VariableManager get_vars() 8454 1726882414.16281: done with get_vars() 8454 1726882414.16283: filtering new block on tags 8454 1726882414.16313: done filtering new block on tags 8454 1726882414.16316: done iterating over new_blocks loaded from include file included: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml for managed_node3 8454 1726882414.16321: extending task lists for all hosts with included blocks 8454 1726882414.17567: done 
extending task lists 8454 1726882414.17569: done processing included files 8454 1726882414.17570: results queue empty 8454 1726882414.17571: checking for any_errors_fatal 8454 1726882414.17574: done checking for any_errors_fatal 8454 1726882414.17575: checking for max_fail_percentage 8454 1726882414.17576: done checking for max_fail_percentage 8454 1726882414.17577: checking to see if all hosts have failed and the running result is not ok 8454 1726882414.17578: done checking to see if all hosts have failed 8454 1726882414.17579: getting the remaining hosts for this loop 8454 1726882414.17580: done getting the remaining hosts for this loop 8454 1726882414.17583: getting the next task for host managed_node3 8454 1726882414.17589: done getting next task for host managed_node3 8454 1726882414.17592: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 8454 1726882414.17595: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8454 1726882414.17606: getting variables 8454 1726882414.17613: in VariableManager get_vars() 8454 1726882414.17631: Calling all_inventory to load vars for managed_node3 8454 1726882414.17636: Calling groups_inventory to load vars for managed_node3 8454 1726882414.17639: Calling all_plugins_inventory to load vars for managed_node3 8454 1726882414.17645: Calling all_plugins_play to load vars for managed_node3 8454 1726882414.17648: Calling groups_plugins_inventory to load vars for managed_node3 8454 1726882414.17652: Calling groups_plugins_play to load vars for managed_node3 8454 1726882414.17871: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8454 1726882414.18172: done with get_vars() 8454 1726882414.18182: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role are present] *** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:3 Friday 20 September 2024 21:33:34 -0400 (0:00:00.055) 0:00:12.200 ****** 8454 1726882414.18270: entering _queue_task() for managed_node3/setup 8454 1726882414.18518: worker is 1 (out of 1 available) 8454 1726882414.18529: exiting _queue_task() for managed_node3/setup 8454 1726882414.18546: done queuing things up, now waiting for results queue to drain 8454 1726882414.18548: waiting for pending results... 
8454 1726882414.18866: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 8454 1726882414.19072: in run() - task 0affe814-3a2d-f59f-16b9-000000000188 8454 1726882414.19076: variable 'ansible_search_path' from source: unknown 8454 1726882414.19079: variable 'ansible_search_path' from source: unknown 8454 1726882414.19082: calling self._execute() 8454 1726882414.19151: variable 'ansible_host' from source: host vars for 'managed_node3' 8454 1726882414.19164: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8454 1726882414.19187: variable 'omit' from source: magic vars 8454 1726882414.19606: variable 'ansible_distribution_major_version' from source: facts 8454 1726882414.19635: Evaluated conditional (ansible_distribution_major_version != '6'): True 8454 1726882414.19918: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 8454 1726882414.22509: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 8454 1726882414.22600: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 8454 1726882414.22664: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 8454 1726882414.22709: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 8454 1726882414.22772: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 8454 1726882414.22848: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 8454 1726882414.22897: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 8454 1726882414.22991: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 8454 1726882414.22996: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 8454 1726882414.23020: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 8454 1726882414.23086: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 8454 1726882414.23122: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 8454 1726882414.23154: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 8454 1726882414.23207: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 8454 1726882414.23228: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 8454 1726882414.23441: variable '__network_required_facts' from source: role '' defaults 
8454 1726882414.23457: variable 'ansible_facts' from source: unknown 8454 1726882414.23577: Evaluated conditional (__network_required_facts | difference(ansible_facts.keys() | list) | length > 0): False 8454 1726882414.23689: when evaluation is False, skipping this task 8454 1726882414.23692: _execute() done 8454 1726882414.23695: dumping result to json 8454 1726882414.23697: done dumping result, returning 8454 1726882414.23700: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present [0affe814-3a2d-f59f-16b9-000000000188] 8454 1726882414.23702: sending task result for task 0affe814-3a2d-f59f-16b9-000000000188 skipping: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 8454 1726882414.23841: no more pending results, returning what we have 8454 1726882414.23845: results queue empty 8454 1726882414.23847: checking for any_errors_fatal 8454 1726882414.23848: done checking for any_errors_fatal 8454 1726882414.23849: checking for max_fail_percentage 8454 1726882414.23851: done checking for max_fail_percentage 8454 1726882414.23853: checking to see if all hosts have failed and the running result is not ok 8454 1726882414.23853: done checking to see if all hosts have failed 8454 1726882414.23854: getting the remaining hosts for this loop 8454 1726882414.23856: done getting the remaining hosts for this loop 8454 1726882414.23861: getting the next task for host managed_node3 8454 1726882414.23874: done getting next task for host managed_node3 8454 1726882414.23879: ^ task is: TASK: fedora.linux_system_roles.network : Check if system is ostree 8454 1726882414.23883: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8454 1726882414.23900: getting variables 8454 1726882414.23902: in VariableManager get_vars() 8454 1726882414.23957: Calling all_inventory to load vars for managed_node3 8454 1726882414.23960: Calling groups_inventory to load vars for managed_node3 8454 1726882414.23963: Calling all_plugins_inventory to load vars for managed_node3 8454 1726882414.24153: Calling all_plugins_play to load vars for managed_node3 8454 1726882414.24159: Calling groups_plugins_inventory to load vars for managed_node3 8454 1726882414.24240: Calling groups_plugins_play to load vars for managed_node3 8454 1726882414.24522: done sending task result for task 0affe814-3a2d-f59f-16b9-000000000188 8454 1726882414.24526: WORKER PROCESS EXITING 8454 1726882414.24557: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8454 1726882414.24858: done with get_vars() 8454 1726882414.24872: done getting variables TASK [fedora.linux_system_roles.network : Check if system is ostree] *********** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:12 Friday 20 September 2024 21:33:34 -0400 (0:00:00.067) 0:00:12.267 ****** 8454 1726882414.25001: entering _queue_task() for managed_node3/stat 8454 1726882414.25380: worker is 1 (out of 1 
available) 8454 1726882414.25392: exiting _queue_task() for managed_node3/stat 8454 1726882414.25404: done queuing things up, now waiting for results queue to drain 8454 1726882414.25405: waiting for pending results... 8454 1726882414.25599: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if system is ostree 8454 1726882414.25792: in run() - task 0affe814-3a2d-f59f-16b9-00000000018a 8454 1726882414.25815: variable 'ansible_search_path' from source: unknown 8454 1726882414.25824: variable 'ansible_search_path' from source: unknown 8454 1726882414.25878: calling self._execute() 8454 1726882414.25974: variable 'ansible_host' from source: host vars for 'managed_node3' 8454 1726882414.25985: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8454 1726882414.26004: variable 'omit' from source: magic vars 8454 1726882414.26423: variable 'ansible_distribution_major_version' from source: facts 8454 1726882414.26447: Evaluated conditional (ansible_distribution_major_version != '6'): True 8454 1726882414.26662: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 8454 1726882414.27055: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 8454 1726882414.27119: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 8454 1726882414.27163: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 8454 1726882414.27215: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 8454 1726882414.27319: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 8454 1726882414.27425: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' 
from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 8454 1726882414.27429: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 8454 1726882414.27436: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 8454 1726882414.27543: variable '__network_is_ostree' from source: set_fact 8454 1726882414.27556: Evaluated conditional (not __network_is_ostree is defined): False 8454 1726882414.27563: when evaluation is False, skipping this task 8454 1726882414.27571: _execute() done 8454 1726882414.27578: dumping result to json 8454 1726882414.27587: done dumping result, returning 8454 1726882414.27598: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if system is ostree [0affe814-3a2d-f59f-16b9-00000000018a] 8454 1726882414.27607: sending task result for task 0affe814-3a2d-f59f-16b9-00000000018a skipping: [managed_node3] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 8454 1726882414.27800: no more pending results, returning what we have 8454 1726882414.27805: results queue empty 8454 1726882414.27806: checking for any_errors_fatal 8454 1726882414.27812: done checking for any_errors_fatal 8454 1726882414.27813: checking for max_fail_percentage 8454 1726882414.27815: done checking for max_fail_percentage 8454 1726882414.27817: checking to see if all hosts have failed and the running result is not ok 8454 1726882414.27818: done checking to see if all hosts have failed 8454 1726882414.27819: getting the remaining hosts for this loop 8454 1726882414.27821: done getting the remaining hosts for this loop 8454 
1726882414.27826: getting the next task for host managed_node3 8454 1726882414.27833: done getting next task for host managed_node3 8454 1726882414.27839: ^ task is: TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 8454 1726882414.27845: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8454 1726882414.27860: getting variables 8454 1726882414.27861: in VariableManager get_vars() 8454 1726882414.27906: Calling all_inventory to load vars for managed_node3 8454 1726882414.27909: Calling groups_inventory to load vars for managed_node3 8454 1726882414.27912: Calling all_plugins_inventory to load vars for managed_node3 8454 1726882414.27923: Calling all_plugins_play to load vars for managed_node3 8454 1726882414.27927: Calling groups_plugins_inventory to load vars for managed_node3 8454 1726882414.27931: Calling groups_plugins_play to load vars for managed_node3 8454 1726882414.28124: done sending task result for task 0affe814-3a2d-f59f-16b9-00000000018a 8454 1726882414.28127: WORKER PROCESS EXITING 8454 1726882414.28396: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8454 1726882414.28702: done with get_vars() 8454 1726882414.28713: done getting variables 8454 1726882414.28773: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Set flag to indicate system is ostree] *** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:17 Friday 20 September 2024 21:33:34 -0400 (0:00:00.038) 0:00:12.305 ****** 8454 1726882414.28816: entering _queue_task() for managed_node3/set_fact 8454 1726882414.29258: worker is 1 (out of 1 available) 8454 1726882414.29266: exiting _queue_task() for managed_node3/set_fact 8454 1726882414.29277: done queuing things up, now waiting for results queue to drain 8454 1726882414.29278: waiting for pending results... 
8454 1726882414.29329: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 8454 1726882414.29507: in run() - task 0affe814-3a2d-f59f-16b9-00000000018b 8454 1726882414.29530: variable 'ansible_search_path' from source: unknown 8454 1726882414.29541: variable 'ansible_search_path' from source: unknown 8454 1726882414.29580: calling self._execute() 8454 1726882414.29672: variable 'ansible_host' from source: host vars for 'managed_node3' 8454 1726882414.29685: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8454 1726882414.29702: variable 'omit' from source: magic vars 8454 1726882414.30111: variable 'ansible_distribution_major_version' from source: facts 8454 1726882414.30129: Evaluated conditional (ansible_distribution_major_version != '6'): True 8454 1726882414.30336: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 8454 1726882414.30646: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 8454 1726882414.30708: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 8454 1726882414.30755: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 8454 1726882414.30799: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 8454 1726882414.30901: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 8454 1726882414.30951: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 8454 1726882414.30988: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 8454 1726882414.31024: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 8454 1726882414.31132: variable '__network_is_ostree' from source: set_fact 8454 1726882414.31154: Evaluated conditional (not __network_is_ostree is defined): False 8454 1726882414.31162: when evaluation is False, skipping this task 8454 1726882414.31169: _execute() done 8454 1726882414.31254: dumping result to json 8454 1726882414.31258: done dumping result, returning 8454 1726882414.31261: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree [0affe814-3a2d-f59f-16b9-00000000018b] 8454 1726882414.31263: sending task result for task 0affe814-3a2d-f59f-16b9-00000000018b 8454 1726882414.31329: done sending task result for task 0affe814-3a2d-f59f-16b9-00000000018b 8454 1726882414.31333: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 8454 1726882414.31409: no more pending results, returning what we have 8454 1726882414.31414: results queue empty 8454 1726882414.31415: checking for any_errors_fatal 8454 1726882414.31421: done checking for any_errors_fatal 8454 1726882414.31422: checking for max_fail_percentage 8454 1726882414.31424: done checking for max_fail_percentage 8454 1726882414.31425: checking to see if all hosts have failed and the running result is not ok 8454 1726882414.31426: done checking to see if all hosts have failed 8454 1726882414.31427: getting the remaining hosts for this loop 8454 1726882414.31429: done getting the remaining hosts for this loop 8454 1726882414.31433: 
getting the next task for host managed_node3 8454 1726882414.31446: done getting next task for host managed_node3 8454 1726882414.31450: ^ task is: TASK: fedora.linux_system_roles.network : Check which services are running 8454 1726882414.31455: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8454 1726882414.31474: getting variables 8454 1726882414.31476: in VariableManager get_vars() 8454 1726882414.31522: Calling all_inventory to load vars for managed_node3 8454 1726882414.31526: Calling groups_inventory to load vars for managed_node3 8454 1726882414.31529: Calling all_plugins_inventory to load vars for managed_node3 8454 1726882414.31586: Calling all_plugins_play to load vars for managed_node3 8454 1726882414.31590: Calling groups_plugins_inventory to load vars for managed_node3 8454 1726882414.31594: Calling groups_plugins_play to load vars for managed_node3 8454 1726882414.31937: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8454 1726882414.32228: done with get_vars() 8454 1726882414.32246: done getting variables TASK [fedora.linux_system_roles.network : Check which services are running] **** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 Friday 20 September 2024 21:33:34 -0400 (0:00:00.035) 0:00:12.341 ****** 8454 1726882414.32359: entering _queue_task() for managed_node3/service_facts 8454 1726882414.32362: Creating lock for service_facts 8454 1726882414.32622: worker is 1 (out of 1 available) 8454 1726882414.32636: exiting _queue_task() for managed_node3/service_facts 8454 1726882414.32652: done queuing things up, now waiting for results queue to drain 8454 1726882414.32653: waiting for pending results... 
8454 1726882414.32968: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check which services are running 8454 1726882414.33116: in run() - task 0affe814-3a2d-f59f-16b9-00000000018d 8454 1726882414.33140: variable 'ansible_search_path' from source: unknown 8454 1726882414.33150: variable 'ansible_search_path' from source: unknown 8454 1726882414.33199: calling self._execute() 8454 1726882414.33296: variable 'ansible_host' from source: host vars for 'managed_node3' 8454 1726882414.33391: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8454 1726882414.33396: variable 'omit' from source: magic vars 8454 1726882414.33829: variable 'ansible_distribution_major_version' from source: facts 8454 1726882414.33849: Evaluated conditional (ansible_distribution_major_version != '6'): True 8454 1726882414.33860: variable 'omit' from source: magic vars 8454 1726882414.33968: variable 'omit' from source: magic vars 8454 1726882414.34015: variable 'omit' from source: magic vars 8454 1726882414.34069: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 8454 1726882414.34115: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 8454 1726882414.34150: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 8454 1726882414.34179: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8454 1726882414.34199: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8454 1726882414.34240: variable 'inventory_hostname' from source: host vars for 'managed_node3' 8454 1726882414.34263: variable 'ansible_host' from source: host vars for 'managed_node3' 8454 1726882414.34269: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed_node3' 8454 1726882414.34439: Set connection var ansible_connection to ssh 8454 1726882414.34443: Set connection var ansible_shell_executable to /bin/sh 8454 1726882414.34445: Set connection var ansible_timeout to 10 8454 1726882414.34448: Set connection var ansible_shell_type to sh 8454 1726882414.34450: Set connection var ansible_pipelining to False 8454 1726882414.34456: Set connection var ansible_module_compression to ZIP_DEFLATED 8454 1726882414.34496: variable 'ansible_shell_executable' from source: unknown 8454 1726882414.34506: variable 'ansible_connection' from source: unknown 8454 1726882414.34516: variable 'ansible_module_compression' from source: unknown 8454 1726882414.34525: variable 'ansible_shell_type' from source: unknown 8454 1726882414.34532: variable 'ansible_shell_executable' from source: unknown 8454 1726882414.34593: variable 'ansible_host' from source: host vars for 'managed_node3' 8454 1726882414.34597: variable 'ansible_pipelining' from source: unknown 8454 1726882414.34599: variable 'ansible_timeout' from source: unknown 8454 1726882414.34602: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8454 1726882414.34795: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 8454 1726882414.34820: variable 'omit' from source: magic vars 8454 1726882414.34831: starting attempt loop 8454 1726882414.34841: running the handler 8454 1726882414.34862: _low_level_execute_command(): starting 8454 1726882414.34919: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 8454 1726882414.35544: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 8454 1726882414.35548: 
stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found <<< 8454 1726882414.35578: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8454 1726882414.35582: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 <<< 8454 1726882414.35585: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8454 1726882414.35651: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 8454 1726882414.35654: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 8454 1726882414.35779: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8454 1726882414.37647: stdout chunk (state=3): >>>/root <<< 8454 1726882414.37825: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8454 1726882414.37846: stderr chunk (state=3): >>><<< 8454 1726882414.37857: stdout chunk (state=3): >>><<< 8454 1726882414.37886: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 8454 1726882414.37981: _low_level_execute_command(): starting 8454 1726882414.37985: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882414.3789287-8923-243136932533923 `" && echo ansible-tmp-1726882414.3789287-8923-243136932533923="` echo /root/.ansible/tmp/ansible-tmp-1726882414.3789287-8923-243136932533923 `" ) && sleep 0' 8454 1726882414.38407: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 8454 1726882414.38411: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8454 1726882414.38414: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config 
debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 8454 1726882414.38422: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8454 1726882414.38473: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 8454 1726882414.38480: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8454 1726882414.38596: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8454 1726882414.40672: stdout chunk (state=3): >>>ansible-tmp-1726882414.3789287-8923-243136932533923=/root/.ansible/tmp/ansible-tmp-1726882414.3789287-8923-243136932533923 <<< 8454 1726882414.40945: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8454 1726882414.40949: stdout chunk (state=3): >>><<< 8454 1726882414.40951: stderr chunk (state=3): >>><<< 8454 1726882414.40954: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882414.3789287-8923-243136932533923=/root/.ansible/tmp/ansible-tmp-1726882414.3789287-8923-243136932533923 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 8454 1726882414.40956: variable 'ansible_module_compression' from source: unknown 8454 1726882414.40983: ANSIBALLZ: Using lock for service_facts 8454 1726882414.40992: ANSIBALLZ: Acquiring lock 8454 1726882414.41000: ANSIBALLZ: Lock acquired: 140055527470288 8454 1726882414.41009: ANSIBALLZ: Creating module 8454 1726882414.53526: ANSIBALLZ: Writing module into payload 8454 1726882414.53657: ANSIBALLZ: Writing module 8454 1726882414.53685: ANSIBALLZ: Renaming module 8454 1726882414.53700: ANSIBALLZ: Done creating module 8454 1726882414.53839: variable 'ansible_facts' from source: unknown 8454 1726882414.53843: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882414.3789287-8923-243136932533923/AnsiballZ_service_facts.py 8454 1726882414.54027: Sending initial data 8454 1726882414.54040: Sent initial data (160 bytes) 8454 1726882414.54751: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8454 1726882414.54777: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 8454 1726882414.54792: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 8454 1726882414.54931: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8454 1726882414.56780: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 8454 1726882414.56897: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 8454 1726882414.57032: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-8454ynh6eu7y/tmpkvday0jc /root/.ansible/tmp/ansible-tmp-1726882414.3789287-8923-243136932533923/AnsiballZ_service_facts.py <<< 8454 1726882414.57038: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882414.3789287-8923-243136932533923/AnsiballZ_service_facts.py" <<< 8454 1726882414.57145: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-8454ynh6eu7y/tmpkvday0jc" to remote "/root/.ansible/tmp/ansible-tmp-1726882414.3789287-8923-243136932533923/AnsiballZ_service_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882414.3789287-8923-243136932533923/AnsiballZ_service_facts.py" <<< 8454 1726882414.58442: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8454 1726882414.58501: stderr chunk (state=3): >>><<< 8454 1726882414.58557: stdout chunk (state=3): >>><<< 8454 1726882414.58560: done transferring module to remote 8454 1726882414.58563: _low_level_execute_command(): starting 8454 1726882414.58565: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882414.3789287-8923-243136932533923/ /root/.ansible/tmp/ansible-tmp-1726882414.3789287-8923-243136932533923/AnsiballZ_service_facts.py && sleep 0' 8454 1726882414.58999: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 8454 1726882414.59004: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8454 1726882414.59007: stderr chunk (state=3): 
>>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 8454 1726882414.59009: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found <<< 8454 1726882414.59011: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8454 1726882414.59062: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 8454 1726882414.59071: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8454 1726882414.59181: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8454 1726882414.61160: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8454 1726882414.61203: stderr chunk (state=3): >>><<< 8454 1726882414.61206: stdout chunk (state=3): >>><<< 8454 1726882414.61224: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking 
match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 8454 1726882414.61227: _low_level_execute_command(): starting 8454 1726882414.61233: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882414.3789287-8923-243136932533923/AnsiballZ_service_facts.py && sleep 0' 8454 1726882414.61680: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 8454 1726882414.61683: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8454 1726882414.61687: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 8454 1726882414.61689: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8454 1726882414.61740: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 8454 1726882414.61750: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: 
master version 4 <<< 8454 1726882414.61870: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8454 1726882416.53760: stdout chunk (state=3): >>> {"ansible_facts": {"services": {"auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", 
"source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", 
"source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "mdmonitor.service": {"name": "mdmonitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": 
{"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-quit.service": {"name": "plymouth-quit.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-read-write.service": {"name": "plymouth-read-write.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-switch-root.service": {"name": "plymouth-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "raid-check.service": {"name": "raid-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": 
"rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "seria<<< 8454 1726882416.53776: stdout chunk (state=3): >>>l-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "snapd.seeded.service": {"name": "snapd.seeded.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"systemd-ask-password-plymouth.service": {"name": "systemd-ask-password-plymouth.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-homed-activate.service": {"name": "systemd-homed-activate.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-homed.service": {"name": "systemd-homed.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": 
"static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service"<<< 8454 1726882416.53807: stdout chunk (state=3): >>>, "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-networkd.service": {"name": "systemd-networkd.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"systemd-resolved.service": {"name": "systemd-resolved.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", 
"source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-userdbd.service": {"name": "systemd-userdbd.service", "state": "running", "status": "indirect", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-zram-setup@zram0.service": {"name": "systemd-zram-setup@zram0.service", "state": "stopped", "status": "active", "source": "systemd"}, "udisks2.service": {"name": "udisks2.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "unbound-anchor.service": {"name": "unbound-anchor.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "arp-ethers.service": {"name": "arp-ethers.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "bluetooth.service": {"name": "bluetooth.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": 
"static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.bluez.service": {"name": "dbus-org.bluez.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.home1.service": {"name": "dbus-org.freedesktop.home1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.oom1.service": {"name": "dbus-org.freedesktop.oom1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.portable1.service": {"name": "dbus-org.freedesktop.portable1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.resolve1.service": {"name": "dbus-org.freedesktop.resolve1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, 
"dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fwupd-offline-update.service": {"name": "fwupd-offline-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "fwupd-refresh.service": {"name": "fwupd-refresh.service", "state": "inactiv<<< 8454 1726882416.53817: stdout chunk (state=3): >>>e", "status": "static", "source": "systemd"}, "fwupd.service": {"name": "fwupd.service", "state": "inactive", "status": "static", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "mdadm-grow-continue@.service": {"name": "mdadm-grow-continue@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdadm-last-resort@.service": {"name": "mdadm-last-resort@.service", "state": "unknown", "status": "static", "source": "systemd"}, 
"mdcheck_continue.service": {"name": "mdcheck_continue.service", "state": "inactive", "status": "static", "source": "systemd"}, "mdcheck_start.service": {"name": "mdcheck_start.service", "state": "inactive", "status": "static", "source": "systemd"}, "mdmon@.service": {"name": "mdmon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdmonitor-oneshot.service": {"name": "mdmonitor-oneshot.service", "state": "inactive", "status": "static", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-halt.service": {"name": "plymouth-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-kexec.service": {"name": "plymouth-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-poweroff.service": {"name": "plymouth-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-reboot.service": {"name": "plymouth-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-switch-root-initramfs.service": {"name": "plymouth-switch-root-initramfs.service", "state": "inactive", "status": "static", "source": 
"systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon.service": {"name": "quotaon.service", "state": "inactive", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": <<< 8454 1726882416.53845: stdout chunk (state=3): >>>"systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, 
"systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": 
"systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-networkd-wait-online@.service": {"name": "systemd-networkd-wait-online@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-portabled.service": {"name": "systemd-portabled.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck.service": {"name": "systemd-quotacheck.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-soft-reb<<< 8454 1726882416.53852: stdout chunk (state=3): >>>oot.service": {"name": "systemd-soft-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-time-wait-sync.service": {"name": "systemd-time-wait-sync.service", "state": "inactive", "status": 
"disabled", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-zram-setup@.service": {"name": "systemd-zram-setup@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} <<< 8454 1726882416.55486: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.41.238 closed. <<< 8454 1726882416.55549: stderr chunk (state=3): >>><<< 8454 1726882416.55553: stdout chunk (state=3): >>><<< 8454 1726882416.55573: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"services": {"auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": 
"systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": 
{"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "mdmonitor.service": {"name": "mdmonitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": 
{"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-quit.service": {"name": "plymouth-quit.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-read-write.service": {"name": "plymouth-read-write.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-switch-root.service": {"name": "plymouth-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "raid-check.service": {"name": "raid-check.service", "state": "stopped", "status": "static", 
"source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "snapd.seeded.service": {"name": "snapd.seeded.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": 
"sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-plymouth.service": {"name": "systemd-ask-password-plymouth.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-homed-activate.service": {"name": "systemd-homed-activate.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-homed.service": {"name": "systemd-homed.service", "state": "running", 
"status": "enabled", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-networkd.service": {"name": "systemd-networkd.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": 
"systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-resolved.service": {"name": "systemd-resolved.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": 
"systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-userdbd.service": {"name": "systemd-userdbd.service", "state": "running", "status": "indirect", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-zram-setup@zram0.service": {"name": "systemd-zram-setup@zram0.service", "state": "stopped", "status": "active", "source": "systemd"}, "udisks2.service": {"name": "udisks2.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "unbound-anchor.service": {"name": "unbound-anchor.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "arp-ethers.service": {"name": "arp-ethers.service", "state": "inactive", 
"status": "disabled", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "bluetooth.service": {"name": "bluetooth.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.bluez.service": {"name": "dbus-org.bluez.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.home1.service": {"name": "dbus-org.freedesktop.home1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.oom1.service": {"name": 
"dbus-org.freedesktop.oom1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.portable1.service": {"name": "dbus-org.freedesktop.portable1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.resolve1.service": {"name": "dbus-org.freedesktop.resolve1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fwupd-offline-update.service": {"name": "fwupd-offline-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "fwupd-refresh.service": {"name": "fwupd-refresh.service", "state": "inactive", "status": "static", "source": "systemd"}, "fwupd.service": {"name": "fwupd.service", "state": "inactive", "status": "static", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": 
"systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "mdadm-grow-continue@.service": {"name": "mdadm-grow-continue@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdadm-last-resort@.service": {"name": "mdadm-last-resort@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdcheck_continue.service": {"name": "mdcheck_continue.service", "state": "inactive", "status": "static", "source": "systemd"}, "mdcheck_start.service": {"name": "mdcheck_start.service", "state": "inactive", "status": "static", "source": "systemd"}, "mdmon@.service": {"name": "mdmon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdmonitor-oneshot.service": {"name": "mdmonitor-oneshot.service", "state": "inactive", "status": "static", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", 
"state": "inactive", "status": "static", "source": "systemd"}, "plymouth-halt.service": {"name": "plymouth-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-kexec.service": {"name": "plymouth-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-poweroff.service": {"name": "plymouth-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-reboot.service": {"name": "plymouth-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-switch-root-initramfs.service": {"name": "plymouth-switch-root-initramfs.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon.service": {"name": "quotaon.service", "state": "inactive", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", 
"state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": 
{"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-networkd-wait-online@.service": {"name": "systemd-networkd-wait-online@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-portabled.service": {"name": "systemd-portabled.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck.service": {"name": "systemd-quotacheck.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": 
"systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-time-wait-sync.service": {"name": "systemd-time-wait-sync.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-zram-setup@.service": {"name": "systemd-zram-setup@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.41.238 closed. 8454 1726882416.57721: done with _execute_module (service_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'service_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882414.3789287-8923-243136932533923/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 8454 1726882416.57724: _low_level_execute_command(): starting 8454 1726882416.57727: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882414.3789287-8923-243136932533923/ > /dev/null 2>&1 && sleep 0' 8454 1726882416.58456: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 8454 1726882416.58532: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8454 1726882416.58586: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 8454 1726882416.58590: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 8454 1726882416.58621: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8454 1726882416.58742: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8454 1726882416.61140: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8454 1726882416.61144: stdout chunk (state=3): >>><<< 8454 1726882416.61146: stderr chunk (state=3): >>><<< 8454 1726882416.61149: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 
'final all' host 10.31.41.238 originally 10.31.41.238
debug2: match found
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config
debug1: auto-mux: Trying existing master
debug2: fd 3 setting O_NONBLOCK
debug2: mux_client_hello_exchange: master version 4
debug1: mux_client_request_session: master session id: 2
debug2: Received exit status from master 0
8454 1726882416.61151: handler run complete
8454 1726882416.61542: variable 'ansible_facts' from source: unknown
8454 1726882416.61959: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
8454 1726882416.62715: variable 'ansible_facts' from source: unknown
8454 1726882416.62941: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
8454 1726882416.63286: attempt loop complete, returning result
8454 1726882416.63300: _execute() done
8454 1726882416.63309: dumping result to json
8454 1726882416.63395: done dumping result, returning
8454 1726882416.63411: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check which services are running [0affe814-3a2d-f59f-16b9-00000000018d]
8454 1726882416.63421: sending task result for task 0affe814-3a2d-f59f-16b9-00000000018d
ok: [managed_node3] => {
    "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result",
    "changed": false
}
8454 1726882416.64794: no more pending results, returning what we have
8454 1726882416.64798: results queue empty
8454 1726882416.64800: checking for any_errors_fatal
8454 1726882416.64804: done checking for any_errors_fatal
8454 1726882416.64806: checking for max_fail_percentage
8454 1726882416.64808: done checking for max_fail_percentage
8454 1726882416.64809: checking to see if all hosts have failed and the running result is not ok
8454 1726882416.64810: done checking to see if all hosts have failed
8454 1726882416.64811: getting the remaining hosts for this loop
8454 1726882416.64812: done getting the remaining hosts for this loop
8454 1726882416.64816: getting the next task for host managed_node3
8454 1726882416.64823: done getting next task for host managed_node3
8454 1726882416.64827: ^ task is: TASK: fedora.linux_system_roles.network : Check which packages are installed
8454 1726882416.64831: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
8454 1726882416.64845: getting variables
8454 1726882416.64846: in VariableManager get_vars()
8454 1726882416.64882: Calling all_inventory to load vars for managed_node3
8454 1726882416.64886: Calling groups_inventory to load vars for managed_node3
8454 1726882416.64890: Calling all_plugins_inventory to load vars for managed_node3
8454 1726882416.64900: Calling all_plugins_play to load vars for managed_node3
8454 1726882416.64903: Calling groups_plugins_inventory to load vars for managed_node3
8454 1726882416.64907: Calling groups_plugins_play to load vars for managed_node3
8454 1726882416.65492: done sending task result for task 0affe814-3a2d-f59f-16b9-00000000018d
8454 1726882416.65496: WORKER PROCESS EXITING
8454 1726882416.65693: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
8454 1726882416.66449: done with get_vars()
8454 1726882416.66466: done getting variables

TASK [fedora.linux_system_roles.network : Check which packages are installed] ***
task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26
Friday 20 September 2024  21:33:36 -0400 (0:00:02.342)       0:00:14.683 ******
8454 1726882416.66576: entering _queue_task() for managed_node3/package_facts
8454 1726882416.66578: Creating lock for package_facts
8454 1726882416.67061: worker is 1 (out of 1 available)
8454 1726882416.67071: exiting _queue_task() for managed_node3/package_facts
8454 1726882416.67083: done queuing things up, now waiting for results queue to drain
8454 1726882416.67084: waiting for pending results...
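Every task in this run goes through the same low-level sequence that the surrounding entries record: resolve the remote home directory (`echo ~ && sleep 0`), create a private per-task temp directory with `umask 77`, copy and run the `AnsiballZ_*.py` payload, then `rm -f -r` the directory. A minimal local sketch of that shell pattern follows; the `$base` path and the `ansible-tmp-example` directory name are illustrative stand-ins, not the actual `/root/.ansible/tmp/ansible-tmp-<timestamp>-<pid>-<random>` paths from this log:

```shell
# Local sketch of the per-task command pattern seen in this log.
# Step 1: resolve the (remote) home directory, exactly as logged.
home="$(/bin/sh -c 'echo ~ && sleep 0')"

# Stand-in for the remote $HOME so the sketch does not touch /root.
base="$(mktemp -d)"
tmpdir="$base/.ansible/tmp/ansible-tmp-example"

# Step 2: create a private (mode 0700) per-task directory, mirroring the
# '( umask 77 && mkdir -p ... && mkdir ... )' command in the log.
( umask 77 && mkdir -p "$(dirname "$tmpdir")" && mkdir "$tmpdir" )

# Step 3 would transfer AnsiballZ_package_facts.py into $tmpdir via sftp and
# run it with python3; step 4 then removes the directory exactly as logged:
/bin/sh -c "rm -f -r '$tmpdir' > /dev/null 2>&1 && sleep 0"
```

The `umask 77` subshell is what guarantees the temp directories are created `0700` regardless of the login shell's default umask, and the trailing `&& sleep 0` is how Ansible normalizes the exit status of each wrapped command.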
8454 1726882416.67187: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check which packages are installed
8454 1726882416.67373: in run() - task 0affe814-3a2d-f59f-16b9-00000000018e
8454 1726882416.67397: variable 'ansible_search_path' from source: unknown
8454 1726882416.67406: variable 'ansible_search_path' from source: unknown
8454 1726882416.67458: calling self._execute()
8454 1726882416.67553: variable 'ansible_host' from source: host vars for 'managed_node3'
8454 1726882416.67566: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
8454 1726882416.67582: variable 'omit' from source: magic vars
8454 1726882416.68018: variable 'ansible_distribution_major_version' from source: facts
8454 1726882416.68039: Evaluated conditional (ansible_distribution_major_version != '6'): True
8454 1726882416.68050: variable 'omit' from source: magic vars
8454 1726882416.68152: variable 'omit' from source: magic vars
8454 1726882416.68204: variable 'omit' from source: magic vars
8454 1726882416.68252: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection
8454 1726882416.68304: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
8454 1726882416.68395: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell
8454 1726882416.68398: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
8454 1726882416.68401: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
8454 1726882416.68412: variable 'inventory_hostname' from source: host vars for 'managed_node3'
8454 1726882416.68421: variable 'ansible_host' from source: host vars for 'managed_node3'
8454 1726882416.68429: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
8454 1726882416.68562: Set connection var ansible_connection to ssh
8454 1726882416.68579: Set connection var ansible_shell_executable to /bin/sh
8454 1726882416.68592: Set connection var ansible_timeout to 10
8454 1726882416.68599: Set connection var ansible_shell_type to sh
8454 1726882416.68618: Set connection var ansible_pipelining to False
8454 1726882416.68630: Set connection var ansible_module_compression to ZIP_DEFLATED
8454 1726882416.68661: variable 'ansible_shell_executable' from source: unknown
8454 1726882416.68671: variable 'ansible_connection' from source: unknown
8454 1726882416.68680: variable 'ansible_module_compression' from source: unknown
8454 1726882416.68688: variable 'ansible_shell_type' from source: unknown
8454 1726882416.68720: variable 'ansible_shell_executable' from source: unknown
8454 1726882416.68723: variable 'ansible_host' from source: host vars for 'managed_node3'
8454 1726882416.68726: variable 'ansible_pipelining' from source: unknown
8454 1726882416.68728: variable 'ansible_timeout' from source: unknown
8454 1726882416.68731: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
8454 1726882416.68960: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action)
8454 1726882416.69039: variable 'omit' from source: magic vars
8454 1726882416.69042: starting attempt loop
8454 1726882416.69046: running the handler
8454 1726882416.69049: _low_level_execute_command(): starting
8454 1726882416.69051: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0'
8454 1726882416.69832: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<<
8454 1726882416.69840: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration
data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8454 1726882416.69900: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 8454 1726882416.69917: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 8454 1726882416.69947: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8454 1726882416.70380: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8454 1726882416.72017: stdout chunk (state=3): >>>/root <<< 8454 1726882416.72257: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8454 1726882416.72531: stderr chunk (state=3): >>><<< 8454 1726882416.72537: stdout chunk (state=3): >>><<< 8454 1726882416.72540: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass 
debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 8454 1726882416.72544: _low_level_execute_command(): starting 8454 1726882416.72547: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882416.7252226-9007-44193724747426 `" && echo ansible-tmp-1726882416.7252226-9007-44193724747426="` echo /root/.ansible/tmp/ansible-tmp-1726882416.7252226-9007-44193724747426 `" ) && sleep 0' 8454 1726882416.73917: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 8454 1726882416.73920: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found <<< 8454 1726882416.73924: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address <<< 8454 1726882416.73936: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 <<< 8454 1726882416.73939: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8454 1726882416.74249: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 8454 1726882416.74272: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8454 1726882416.74525: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8454 1726882416.76799: stdout chunk (state=3): >>>ansible-tmp-1726882416.7252226-9007-44193724747426=/root/.ansible/tmp/ansible-tmp-1726882416.7252226-9007-44193724747426 <<< 8454 1726882416.76914: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8454 1726882416.76918: stdout chunk (state=3): >>><<< 8454 1726882416.76920: stderr chunk (state=3): >>><<< 8454 1726882416.76923: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882416.7252226-9007-44193724747426=/root/.ansible/tmp/ansible-tmp-1726882416.7252226-9007-44193724747426 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' 
host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 8454 1726882416.76959: variable 'ansible_module_compression' from source: unknown 8454 1726882416.77014: ANSIBALLZ: Using lock for package_facts 8454 1726882416.77019: ANSIBALLZ: Acquiring lock 8454 1726882416.77023: ANSIBALLZ: Lock acquired: 140055525488016 8454 1726882416.77025: ANSIBALLZ: Creating module 8454 1726882417.23419: ANSIBALLZ: Writing module into payload 8454 1726882417.23606: ANSIBALLZ: Writing module 8454 1726882417.23655: ANSIBALLZ: Renaming module 8454 1726882417.23766: ANSIBALLZ: Done creating module 8454 1726882417.23769: variable 'ansible_facts' from source: unknown 8454 1726882417.24136: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882416.7252226-9007-44193724747426/AnsiballZ_package_facts.py 8454 1726882417.24865: Sending initial data 8454 1726882417.24869: Sent initial data (159 bytes) 8454 1726882417.25485: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: 
checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 8454 1726882417.25504: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8454 1726882417.25660: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8454 1726882417.27468: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 8454 1726882417.27620: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 8454 1726882417.27732: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-8454ynh6eu7y/tmpcnt1rogp /root/.ansible/tmp/ansible-tmp-1726882416.7252226-9007-44193724747426/AnsiballZ_package_facts.py <<< 8454 1726882417.27737: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882416.7252226-9007-44193724747426/AnsiballZ_package_facts.py" <<< 8454 1726882417.27867: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-8454ynh6eu7y/tmpcnt1rogp" to remote "/root/.ansible/tmp/ansible-tmp-1726882416.7252226-9007-44193724747426/AnsiballZ_package_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882416.7252226-9007-44193724747426/AnsiballZ_package_facts.py" <<< 8454 1726882417.32777: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8454 1726882417.32832: stderr chunk (state=3): >>><<< 8454 1726882417.32838: stdout chunk (state=3): >>><<< 8454 1726882417.32865: done transferring module to remote 8454 1726882417.32880: _low_level_execute_command(): starting 8454 1726882417.32889: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882416.7252226-9007-44193724747426/ /root/.ansible/tmp/ansible-tmp-1726882416.7252226-9007-44193724747426/AnsiballZ_package_facts.py && sleep 0' 8454 1726882417.34096: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 8454 1726882417.34100: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8454 1726882417.34123: stderr chunk (state=3): 
>>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 8454 1726882417.34131: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found <<< 8454 1726882417.34141: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8454 1726882417.34335: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 8454 1726882417.34343: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 8454 1726882417.34470: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8454 1726882417.34585: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8454 1726882417.36732: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8454 1726882417.36758: stderr chunk (state=3): >>><<< 8454 1726882417.36762: stdout chunk (state=3): >>><<< 8454 1726882417.36829: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 8454 1726882417.36833: _low_level_execute_command(): starting 8454 1726882417.36839: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882416.7252226-9007-44193724747426/AnsiballZ_package_facts.py && sleep 0' 8454 1726882417.38143: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 8454 1726882417.38291: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 8454 1726882417.38363: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8454 1726882417.38423: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 8454 1726882417.38456: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 8454 1726882417.38468: stderr 
chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8454 1726882417.38832: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8454 1726882418.02840: stdout chunk (state=3): >>> {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "13.3.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "12.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.40", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release-identity-basic": [{"name": "fedora-release-identity-basic", "version": "39", "release": "36", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-gpg-keys": [{"name": "fedora-gpg-keys", "version": "39", "release": "2", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-repos": [{"name": "fedora-repos", "version": "39", "release": "2", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release-common": [{"name": "fedora-release-common", "version": "39", "release": "36", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release": [{"name": "fedora-release", "version": "39", "release": "36", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", 
"version": "2.14.4", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "18.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "abattis-cantarell-vf-fonts": [{"name": "abattis-cantarell-vf-fonts", "version": "0.301", "release": "10.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.0", "release": "9.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.0", "release": "9.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240909", 
"release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "pkgconf-m4": [{"name": "pkgconf-m4", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.42", "release": "1.fc39.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "libreport-filesystem": [{"name": "libreport-filesystem", "version": "2.17.11", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.3", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.3", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dhcp-common": [{"name": "dhcp-common", "version": 
"4.4.3", "release": "9.P1.fc39", "epoch": 12, "arch": "noarch", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "7.20230520.fc39.1", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "7.20230520.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib": [{"name": "zlib", "version": "1.2.13", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.48", 
"release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "16.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.10.4", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nspr": [{"name": "nspr", "version": "4.35.0", "release": "22.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "5.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.47", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.42", "release": "1.fc39.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-util": [{"name": "nss-util", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.42.0", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "13.3.1", "release": "1.fc39", "epoch": null,
"arch": "x86_64", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.26", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.10.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "53.20240808cvs.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.1", "release": "8.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": 
"libacl", "version": "2.3.1", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.3", "release": "8.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "3.1.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.5", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.8", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.9.0", "release": "6.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.5", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.15.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.9.1", "release": "2.fc39", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.44", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.5.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.5.3", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.14.0", "release": "2.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.55.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "4.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "protobuf-c": [{"name": "protobuf-c", "version": "1.4.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.44", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.3", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": 
"libnftnl", "version": "1.2.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "7.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "7.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "7.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.4", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.6.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.1.0", "release": "4.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "13.3.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.14", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "17.fc39", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.10.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdb": [{"name": "libdb", "version": "5.3.28", "release": "56.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.9", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ntfs-3g-libs": [{"name": "ntfs-3g-libs", "version": "2022.10.3", "release": "3.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ntfs-3g": [{"name": "ntfs-3g", "version": "2022.10.3", "release": "3.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libibverbs": [{"name": "libibverbs", "version": "46.0", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpcap": [{"name": "libpcap", "version": "1.10.4", "release": "2.fc39", "epoch": 14, "arch": "x86_64", "source": "rpm"}], "libb2": [{"name": "libb2", "version": "0.98.1", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbytesize": [{"name": "libbytesize", "version": "2.11", "release": "99.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nano": [{"name": "nano", "version": "7.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nano-default-editor": [{"name": "nano-default-editor", "version": "7.2", "release": "5.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libatasmart": [{"name": "libatasmart", "version": "0.19", "release": "26.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.5", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.20", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn-freebl": [{"name": "nss-softokn-freebl", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn": [{"name": "nss-softokn", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.0", "release": "9.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "1.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "36", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "633", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.28.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "exfatprogs": [{"name": "exfatprogs", "version": "1.2.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmaxminddb": [{"name": "libmaxminddb", "version": "1.11.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipcalc": [{"name": "ipcalc", "version": "1.0.3", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.33", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.6", "release": "17.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.0.1", "release": 
"1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dosfstools": [{"name": "dosfstools", "version": "4.2", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.13.1", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libargon2": [{"name": "libargon2", "version": "20190702", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevdev": [{"name": "libevdev", "version": "1.13.1", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth-core-libs": [{"name": "plymouth-core-libs", "version": "24.004.60", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.8", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "24.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.9", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.7", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": 
"0.2.1", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpkgconf": [{"name": "libpkgconf", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf": [{"name": "pkgconf", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-pkg-config": [{"name": "pkgconf-pkg-config", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-efi": [{"name": "fwupd-efi", "version": "1.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsigsegv": [{"name": "libsigsegv", "version": "2.14", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.2.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtool-ltdl": [{"name": "libtool-ltdl", "version": "2.4.7", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "btrfs-progs": [{"name": "btrfs-progs", "version": "6.10.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "4.fc39", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.4.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.4.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.0.5", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.3", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-pip-wheel": [{"name": "python-pip-wheel", "version": "23.2.1", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", 
"release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnsl2": [{"name": "libnsl2", "version": "2.0.0", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.0.2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.2.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20231204", "release": "1.git1e3a2e4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20231204", "release": "1.git1e3a2e4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.12", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kpartx": 
[{"name": "kpartx", "version": "0.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.4.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.5.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.3", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2023.2.60_v7.0.306", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.1.4", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "30", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.0.2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.5.3", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name":
"cryptsetup-libs", "version": "2.6.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "30", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.13.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-networkd": [{"name": "systemd-networkd", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-resolved": [{"name": "systemd-resolved", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.4.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "10.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "059", "release": "16.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "73.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": 
"3.8.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.78.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-utils": [{"name": "libblockdev-utils", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "123", "release": "1.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "123", "release": "1.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "26.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-glib": [{"name": "json-glib", "version": "1.8.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgudev": [{"name": "libgudev", "version": "238", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.10.9", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.10.9", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgusb": [{"name": "libgusb", "version": "0.4.9", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmbim": [{"name": "libmbim", "version": "1.28.4", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shared-mime-info": [{"name": "shared-mime-info", "version": "2.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxmlb": [{"name": "libxmlb", "version": "0.3.19", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.3p1", 
"release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.1", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zchunk-libs": [{"name": "zchunk-libs", "version": "1.5.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.8.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.10.9", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.5.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.0.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpgme": [{"name": "gpgme", "version": "1.20.0", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libjcat": [{"name": "libjcat", "version": "0.2.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev": [{"name": "libblockdev", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-fs": [{"name": "libblockdev-fs", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-loop": [{"name": "libblockdev-loop", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-part": [{"name": "libblockdev-part", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-swap": [{"name": "libblockdev-swap", "version": 
"3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.44.2", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bluez": [{"name": "bluez", "version": "5.77", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ModemManager-glib": [{"name": "ModemManager-glib", "version": "1.20.6", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.78.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libqrtr-glib": [{"name": "libqrtr-glib", "version": "1.2.2", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libqmi": [{"name": "libqmi", "version": "1.32.4", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libudisks2": [{"name": "libudisks2", "version": "2.10.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuser": [{"name": "libuser", "version": "0.64", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ntfsprogs": [{"name": "ntfsprogs", "version": "2022.10.3", "release": "3.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.20", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20221126", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mdadm": [{"name": "mdadm", "version": "4.2", "release": "6.fc39", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "libblockdev-mdraid": [{"name": "libblockdev-mdraid", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zram-generator": [{"name": "zram-generator", "version": "1.1.2", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth": [{"name": "plymouth", "version": "24.004.60", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth-scripts": [{"name": "plymouth-scripts", "version": "24.004.60", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "1.p5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "1.p5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.7.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.30", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "deltarpm": [{"name": "deltarpm", "version": "3.6.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.3", "release": "1.fc39", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unbound-anchor": [{"name": "unbound-anchor", "version": "1.20.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unbound-libs": [{"name": "unbound-libs", "version": "1.20.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-unbound": [{"name": "python3-unbound", "version": "1.20.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfsverity": [{"name": "libfsverity", "version": "1.4", "release": "10.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnvme": [{"name": "libnvme", "version": "1.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-nvme": [{"name": "libblockdev-nvme", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-sysinit": [{"name": "nss-sysinit", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss": [{"name": "nss", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "volume_key-libs": [{"name": 
"volume_key-libs", "version": "0.3.12", "release": "20.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-crypto": [{"name": "libblockdev-crypto", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "udisks2": [{"name": "udisks2", "version": "2.10.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-plugin-modem-manager": [{"name": "fwupd-plugin-modem-manager", "version": "1.9.21", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-plugin-uefi-capsule-data": [{"name": "fwupd-plugin-uefi-capsule-data", "version": "1.9.21", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd": [{"name": "fwupd", "version": "1.9.21", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.9", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "40.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.2.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.5", "release": "8.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "39.7", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "39.7", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.9.5", "release": "1.fc39", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.20", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.8.0", "release": "6.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.7", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.0.5", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "12.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "10.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.9.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.9.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.0.5", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": 
"2.06", "release": "121.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zram-generator-defaults": [{"name": "zram-generator-defaults", "version": "1.1.2", "release": "11.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dhcp-client": [{"name": "dhcp-client", "version": "4.4.3", "release": "9.P1.fc39", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "3.1.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "passwd": [{"name": "passwd", "version": "0.80", "release": "15.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.44.2", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.10.9", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.3p1", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.3p1", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "059", "release": "16.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.11.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.5", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-en": [{"name": 
"langpacks-en", "version": "4.0", "release": "9.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "7.20230520.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "34.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.58", "release": "500.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.88", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.188", "release": "501.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "per<<< 8454 1726882418.03059: stdout chunk (state=3): >>>l-URI", "version": "5.21", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "500.fc39", "epoch": 1, "arch": 
"noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2023.0511", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "3.fc39", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.92", "release": "10.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20230801", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.10", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.083", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.13", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], 
"perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "4.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.037", "release": "3.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.13", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.15", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.28", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "500.fc39", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.52", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "502.fc39", 
"epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "500.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "500.fc39", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.37", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.13", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "500.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.54", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.77", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.89", "release": "500.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.16", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": 
[{"name": "perl-Encode", "version": "3.19", "release": "500.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.54", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.38.2", "release": "502.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.38.2", "release": "502.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "boost-system": [{"name": "boost-system", "version": "1.81.0", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.40", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.40", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd-devel": [{"name": "libzstd-devel", "version": "1.5.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-devel": [{"name": "zlib-devel", "version": "1.2.13", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf-devel": [{"name": "elfutils-libelf-devel", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unzip": [{"name": "unzip", "version": "6.0", "release": "62.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "m4": [{"name": "m4", "version": "1.4.19", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmpc": [{"name": "libmpc", "version": "1.3.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "avahi-libs": [{"name": "avahi-libs", "version": "0.8", "release": "24.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "aspell": [{"name": "aspell", 
"version": "0.60.8", "release": "12.fc39", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "cpp": [{"name": "cpp", "version": "13.3.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bison": [{"name": "bison", "version": "3.8.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "flex": [{"name": "flex", "version": "2.6.4", "release": "13.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zip": [{"name": "zip", "version": "3.0", "release": "39.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-thread": [{"name": "boost-thread", "version": "1.81.0", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "strace": [{"name": "strace", "version": "6.10", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-devel": [{"name": "openssl-devel", "version": "3.1.4", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-headers": [{"name": "kernel-headers", "version": "6.10.3", "release": "200.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-headers-x86": [{"name": "glibc-headers-x86", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-devel": [{"name": "glibc-devel", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-devel": [{"name": "libxcrypt-devel", "version": "4.4.36", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client-devel": [{"name": "elfutils-debuginfod-client-devel", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "efivar-libs": [{"name": "efivar-libs", "version": "39", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mokutil": [{"name": "mokutil", "version": "0.6.0", "release": "7.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "boost-atomic": [{"name": "boost-atomic", 
"version": "1.81.0", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-filesystem": [{"name": "boost-filesystem", "version": "1.81.0", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-devel": [{"name": "xz-devel", "version": "5.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-devel": [{"name": "elfutils-devel", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tbb": [{"name": "tbb", "version": "2020.3", "release": "20.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dyninst": [{"name": "dyninst", "version": "12.3.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-runtime": [{"name": "systemtap-runtime", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-client": [{"name": "systemtap-client", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gc": [{"name": "gc", "version": "8.2.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "guile22": [{"name": "guile22", "version": "2.2.7", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "2.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gcc": [{"name": "gcc", "version": "13.3.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-devel": [{"name": "kernel-devel", "version": "6.10.9", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-devel": [{"name": "systemtap-devel", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap": [{"name": "systemtap", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"qa-tools": [{"name": "qa-tools", "version": "4.1", "release": "5.fc39", "epoch": null, "arch": "n<<< 8454 1726882418.03098: stdout chunk (state=3): >>>oarch", "source": "rpm"}], "aspell-en": [{"name": "aspell-en", "version": "2020.12.07", "release": "8.fc39", "epoch": 50, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "0.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.46.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.46.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "23.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "1.rc1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.43", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.42", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "4.9.3", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsodium": [{"name": "libsodium", "version": "1.0.18", "release": "15.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-utils": [{"name": "dnf-utils", "version": "4.9.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "xxhash-libs": [{"name": "xxhash-libs", "version": "0.8.2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "2.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "3.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "13.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.46.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.46.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libmetalink": [{"name": "libmetalink", "version": "0.1.3", "release": "32.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.21.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "8.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.1", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "0.fc39", "epoch": 1, 
"arch": "x86_64", "source": "rpm"}], "gpm-libs": [{"name": "gpm-libs", "version": "1.20.7", "release": "44.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "19.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.31.2", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.fc39eng", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.fc39eng", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.fc39eng", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.1.0", "release": "4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.30.2", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.7.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": 
"python3-jsonschema", "version": "4.19.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "67.7.2", "release": "8.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.4.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "6.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "3.1.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.1.4", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gdisk": [{"name": "gdisk", "version": "1.0.10", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pysocks": [{"name": "python3-pysocks", "version": "1.7.1", "release": "20.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3+socks": [{"name": "python3-urllib3+socks", "version": "1.26.19", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.28.2", "release": "5.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "6.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": 
"python3-oauthlib", "version": "3.2.2", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-netifaces": [{"name": "python3-netifaces", "version": "0.11.0", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.5", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "5.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "net-tools": [{"name": "net-tools", "version": "2.0", "release": "0.67.20160912git.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.5", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.5", "release": "8.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "23.2.1", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "16.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": 
[{"name": "lvm2-libs", "version": "2.03.23", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.12", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.23", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "gpg-pubkey": [{"name": "gpg-pubkey", "version": "18b8e74c", "release": "62f2920f", "epoch": null, "arch": null, "source": "rpm"}], "rtl-sdr": [{"name": "rtl-sdr", "version": "0.6.0^20230921git1261fbb2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-pkcs11": [{"name": "openssl-pkcs11", "version": "0.4.12", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.4.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.16", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} <<< 8454 1726882418.04994: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.41.238 closed. 
<<< 8454 1726882418.05047: stderr chunk (state=3): >>><<< 8454 1726882418.05050: stdout chunk (state=3): >>><<< 8454 1726882418.05088: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "13.3.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "12.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.40", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release-identity-basic": [{"name": "fedora-release-identity-basic", "version": "39", "release": "36", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-gpg-keys": [{"name": "fedora-gpg-keys", "version": "39", "release": "2", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-repos": [{"name": "fedora-repos", "version": "39", "release": "2", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release-common": [{"name": "fedora-release-common", "version": "39", "release": "36", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release": [{"name": "fedora-release", "version": "39", "release": "36", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.4", "release": "1.fc39", "epoch": null, 
"arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "18.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "abattis-cantarell-vf-fonts": [{"name": "abattis-cantarell-vf-fonts", "version": "0.301", "release": "10.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.0", "release": "9.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.0", "release": "9.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", 
"source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "pkgconf-m4": [{"name": "pkgconf-m4", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.42", "release": "1.fc39.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "libreport-filesystem": [{"name": "libreport-filesystem", "version": "2.17.11", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.3", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.3", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dhcp-common": [{"name": "dhcp-common", "version": "4.4.3", "release": "9.P1.fc39", "epoch": 12, "arch": 
"noarch", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "7.20230520.fc39.1", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "7.20230520.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib": [{"name": "zlib", "version": "1.2.13", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.48", "release": "9.fc39", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "16.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.10.4", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nspr": [{"name": "nspr", "version": "4.35.0", "release": "22.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "5.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.47", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.42", "release": "1.fc39.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-util": [{"name": "nss-util", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.42.0", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "13.3.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.26", "release": 
"1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.10.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "53.20240808cvs.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.1", "release": "8.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.1", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": 
[{"name": "libcap-ng", "version": "0.8.3", "release": "8.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "3.1.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.5", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.8", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.9.0", "release": "6.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.5", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.15.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.9.1", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.0", "release": 
"2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.44", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.5.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.5.3", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.14.0", "release": "2.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.55.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "4.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "protobuf-c": [{"name": "protobuf-c", "version": "1.4.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.44", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.3", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": 
[{"name": "hunspell-en-GB", "version": "0.20201207", "release": "7.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "7.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "7.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.4", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.6.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.1.0", "release": "4.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "13.3.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.14", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "17.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", 
"release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.10.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdb": [{"name": "libdb", "version": "5.3.28", "release": "56.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.9", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ntfs-3g-libs": [{"name": "ntfs-3g-libs", "version": "2022.10.3", "release": "3.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ntfs-3g": [{"name": "ntfs-3g", "version": "2022.10.3", "release": "3.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libibverbs": [{"name": "libibverbs", "version": "46.0", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpcap": [{"name": "libpcap", "version": "1.10.4", "release": "2.fc39", "epoch": 14, "arch": "x86_64", "source": "rpm"}], "libb2": [{"name": "libb2", "version": "0.98.1", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbytesize": [{"name": "libbytesize", "version": "2.11", "release": "99.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "kbd": [{"name": "kbd", "version": "2.6.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nano": [{"name": "nano", "version": "7.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nano-default-editor": [{"name": "nano-default-editor", "version": "7.2", "release": "5.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libatasmart": [{"name": "libatasmart", "version": "0.19", "release": "26.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.5", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.20", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn-freebl": [{"name": "nss-softokn-freebl", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn": [{"name": "nss-softokn", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.0", "release": "9.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "1.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "36", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "633", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.28.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "exfatprogs": [{"name": "exfatprogs", "version": "1.2.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmaxminddb": [{"name": "libmaxminddb", "version": "1.11.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipcalc": [{"name": "ipcalc", "version": "1.0.3", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.33", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.6", "release": "17.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.0.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dosfstools": [{"name": "dosfstools", "version": "4.2", "release": "7.fc39", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.13.1", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libargon2": [{"name": "libargon2", "version": "20190702", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevdev": [{"name": "libevdev", "version": "1.13.1", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth-core-libs": [{"name": "plymouth-core-libs", "version": "24.004.60", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.8", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "24.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.9", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.7", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": 
"1.3.1", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpkgconf": [{"name": "libpkgconf", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf": [{"name": "pkgconf", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-pkg-config": [{"name": "pkgconf-pkg-config", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-efi": [{"name": "fwupd-efi", "version": "1.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsigsegv": [{"name": "libsigsegv", "version": "2.14", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.2.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtool-ltdl": [{"name": "libtool-ltdl", "version": "2.4.7", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "btrfs-progs": [{"name": "btrfs-progs", "version": "6.10.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.4.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.4.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.0.5", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.3", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-pip-wheel": [{"name": "python-pip-wheel", "version": "23.2.1", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", 
"release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnsl2": [{"name": "libnsl2", "version": "2.0.0", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.0.2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.2.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20231204", "release": "1.git1e3a2e4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20231204", "release": "1.git1e3a2e4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.12", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": 
"device-mapper", "version": "1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.4.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.5.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.3", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2023.2.60_v7.0.306", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.1.4", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "30", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.0.2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.5.3", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.6.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "30", "release": "6.fc39", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.13.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-networkd": [{"name": "systemd-networkd", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-resolved": [{"name": "systemd-resolved", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.4.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "10.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "059", "release": "16.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "73.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.78.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "libblockdev-utils": [{"name": "libblockdev-utils", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "123", "release": "1.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "123", "release": "1.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "26.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-glib": [{"name": "json-glib", "version": "1.8.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgudev": [{"name": "libgudev", "version": "238", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.10.9", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.10.9", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgusb": [{"name": "libgusb", "version": "0.4.9", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmbim": [{"name": "libmbim", "version": "1.28.4", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shared-mime-info": [{"name": "shared-mime-info", "version": "2.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxmlb": [{"name": "libxmlb", "version": "0.3.19", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.3p1", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.1", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "zchunk-libs": [{"name": "zchunk-libs", "version": "1.5.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.8.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.10.9", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.5.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.0.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpgme": [{"name": "gpgme", "version": "1.20.0", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libjcat": [{"name": "libjcat", "version": "0.2.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev": [{"name": "libblockdev", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-fs": [{"name": "libblockdev-fs", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-loop": [{"name": "libblockdev-loop", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-part": [{"name": "libblockdev-part", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-swap": [{"name": "libblockdev-swap", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.44.2", "release": "1.fc39", "epoch": 1, 
"arch": "x86_64", "source": "rpm"}], "bluez": [{"name": "bluez", "version": "5.77", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ModemManager-glib": [{"name": "ModemManager-glib", "version": "1.20.6", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.78.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libqrtr-glib": [{"name": "libqrtr-glib", "version": "1.2.2", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libqmi": [{"name": "libqmi", "version": "1.32.4", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libudisks2": [{"name": "libudisks2", "version": "2.10.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuser": [{"name": "libuser", "version": "0.64", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ntfsprogs": [{"name": "ntfsprogs", "version": "2022.10.3", "release": "3.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.20", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20221126", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mdadm": [{"name": "mdadm", "version": "4.2", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-mdraid": [{"name": "libblockdev-mdraid", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zram-generator": [{"name": 
"zram-generator", "version": "1.1.2", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth": [{"name": "plymouth", "version": "24.004.60", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth-scripts": [{"name": "plymouth-scripts", "version": "24.004.60", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "1.p5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "1.p5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.7.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.30", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "deltarpm": [{"name": "deltarpm", "version": "3.6.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": 
[{"name": "python3-hawkey", "version": "0.73.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unbound-anchor": [{"name": "unbound-anchor", "version": "1.20.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unbound-libs": [{"name": "unbound-libs", "version": "1.20.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-unbound": [{"name": "python3-unbound", "version": "1.20.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfsverity": [{"name": "libfsverity", "version": "1.4", "release": "10.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnvme": [{"name": "libnvme", "version": "1.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-nvme": [{"name": "libblockdev-nvme", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-sysinit": [{"name": "nss-sysinit", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss": [{"name": "nss", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "volume_key-libs": [{"name": "volume_key-libs", "version": "0.3.12", "release": "20.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-crypto": [{"name": "libblockdev-crypto", "version": "3.1.1", 
"release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "udisks2": [{"name": "udisks2", "version": "2.10.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-plugin-modem-manager": [{"name": "fwupd-plugin-modem-manager", "version": "1.9.21", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-plugin-uefi-capsule-data": [{"name": "fwupd-plugin-uefi-capsule-data", "version": "1.9.21", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd": [{"name": "fwupd", "version": "1.9.21", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.9", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "40.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.2.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.5", "release": "8.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "39.7", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "39.7", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.20", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"python3-dnf": [{"name": "python3-dnf", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.8.0", "release": "6.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.7", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.0.5", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "12.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "10.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.9.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.9.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.0.5", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "1.fc39", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "zram-generator-defaults": [{"name": "zram-generator-defaults", "version": "1.1.2", "release": "11.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dhcp-client": [{"name": "dhcp-client", "version": "4.4.3", "release": "9.P1.fc39", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "3.1.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "passwd": [{"name": "passwd", "version": "0.80", "release": "15.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.44.2", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.10.9", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.3p1", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.3p1", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "059", "release": "16.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.11.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.5", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.0", "release": "9.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.719", "release": 
"1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "7.20230520.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "34.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.58", "release": "500.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.88", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.188", "release": "501.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.21", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2023.0511", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", 
"version": "1.350", "release": "3.fc39", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.92", "release": "10.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20230801", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.10", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.083", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.13", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "4.fc39", 
"epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.037", "release": "3.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.13", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.15", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.28", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "500.fc39", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.52", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "500.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": 
"perl-Scalar-List-Utils", "version": "1.63", "release": "500.fc39", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.37", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.13", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "500.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.54", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.77", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.89", "release": "500.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.16", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.19", "release": "500.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.54", "release": "502.fc39", "epoch": 0, "arch": 
"x86_64", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.38.2", "release": "502.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.38.2", "release": "502.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "boost-system": [{"name": "boost-system", "version": "1.81.0", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.40", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.40", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd-devel": [{"name": "libzstd-devel", "version": "1.5.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-devel": [{"name": "zlib-devel", "version": "1.2.13", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf-devel": [{"name": "elfutils-libelf-devel", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unzip": [{"name": "unzip", "version": "6.0", "release": "62.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "m4": [{"name": "m4", "version": "1.4.19", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmpc": [{"name": "libmpc", "version": "1.3.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "avahi-libs": [{"name": "avahi-libs", "version": "0.8", "release": "24.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "aspell": [{"name": "aspell", "version": "0.60.8", "release": "12.fc39", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "cpp": [{"name": "cpp", "version": "13.3.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bison": 
[{"name": "bison", "version": "3.8.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "flex": [{"name": "flex", "version": "2.6.4", "release": "13.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zip": [{"name": "zip", "version": "3.0", "release": "39.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-thread": [{"name": "boost-thread", "version": "1.81.0", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "strace": [{"name": "strace", "version": "6.10", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-devel": [{"name": "openssl-devel", "version": "3.1.4", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-headers": [{"name": "kernel-headers", "version": "6.10.3", "release": "200.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-headers-x86": [{"name": "glibc-headers-x86", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-devel": [{"name": "glibc-devel", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-devel": [{"name": "libxcrypt-devel", "version": "4.4.36", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client-devel": [{"name": "elfutils-debuginfod-client-devel", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "efivar-libs": [{"name": "efivar-libs", "version": "39", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mokutil": [{"name": "mokutil", "version": "0.6.0", "release": "7.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "boost-atomic": [{"name": "boost-atomic", "version": "1.81.0", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-filesystem": [{"name": "boost-filesystem", "version": "1.81.0", "release": "11.fc39", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "xz-devel": [{"name": "xz-devel", "version": "5.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-devel": [{"name": "elfutils-devel", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tbb": [{"name": "tbb", "version": "2020.3", "release": "20.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dyninst": [{"name": "dyninst", "version": "12.3.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-runtime": [{"name": "systemtap-runtime", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-client": [{"name": "systemtap-client", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gc": [{"name": "gc", "version": "8.2.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "guile22": [{"name": "guile22", "version": "2.2.7", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "2.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gcc": [{"name": "gcc", "version": "13.3.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-devel": [{"name": "kernel-devel", "version": "6.10.9", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-devel": [{"name": "systemtap-devel", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap": [{"name": "systemtap", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "qa-tools": [{"name": "qa-tools", "version": "4.1", "release": "5.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "aspell-en": [{"name": "aspell-en", "version": "2020.12.07", "release": "8.fc39", "epoch": 50, 
"arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "0.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.46.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.46.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "23.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "1.rc1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.43", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.42", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "4.9.3", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsodium": [{"name": "libsodium", "version": "1.0.18", "release": "15.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-utils": [{"name": "dnf-utils", "version": "4.9.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "xxhash-libs": [{"name": 
"xxhash-libs", "version": "0.8.2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "2.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "3.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "13.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.46.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.46.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libmetalink": [{"name": "libmetalink", "version": "0.1.3", "release": "32.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.21.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "8.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.1", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "0.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gpm-libs": [{"name": "gpm-libs", "version": "1.20.7", "release": "44.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "19.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"beakerlib": [{"name": "beakerlib", "version": "1.31.2", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.fc39eng", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.fc39eng", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.fc39eng", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.1.0", "release": "4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.30.2", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.7.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "67.7.2", "release": "8.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": 
[{"name": "python3-setools", "version": "4.4.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "6.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "3.1.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.1.4", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gdisk": [{"name": "gdisk", "version": "1.0.10", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pysocks": [{"name": "python3-pysocks", "version": "1.7.1", "release": "20.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3+socks": [{"name": "python3-urllib3+socks", "version": "1.26.19", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.28.2", "release": "5.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "6.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-netifaces": [{"name": "python3-netifaces", "version": "0.11.0", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-markupsafe": [{"name": 
"python3-markupsafe", "version": "2.1.3", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.5", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "5.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "net-tools": [{"name": "net-tools", "version": "2.0", "release": "0.67.20160912git.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.5", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.5", "release": "8.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "23.2.1", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "16.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.23", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.12", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"lvm2": [{"name": "lvm2", "version": "2.03.23", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "gpg-pubkey": [{"name": "gpg-pubkey", "version": "18b8e74c", "release": "62f2920f", "epoch": null, "arch": null, "source": "rpm"}], "rtl-sdr": [{"name": "rtl-sdr", "version": "0.6.0^20230921git1261fbb2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-pkcs11": [{"name": "openssl-pkcs11", "version": "0.4.12", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.4.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.16", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found 
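The `package_facts` payload that ends above has a consistent shape: a dict mapping each package name to a *list* of installed instances, each with `name`, `version`, `release`, `epoch` (often `null`), `arch`, and `source` keys. A minimal sketch of consuming that structure follows; the two sample entries are copied from the log, and the `evr` helper name is ours, not part of the module:

```python
import json

# ansible_facts.packages maps package name -> list of installed instances
# (a list because multilib or multiple kernel versions can coexist).
# Sample data copied from the package_facts output in the log above.
packages = json.loads('''
{
  "gcc": [{"name": "gcc", "version": "13.3.1", "release": "1.fc39",
           "epoch": null, "arch": "x86_64", "source": "rpm"}],
  "openssl": [{"name": "openssl", "version": "3.1.4", "release": "3.fc39",
               "epoch": 1, "arch": "x86_64", "source": "rpm"}]
}
''')

def evr(pkg):
    """Render epoch:version-release the way rpm prints it (helper is ours)."""
    epoch = pkg.get("epoch")
    prefix = f"{epoch}:" if epoch is not None else ""
    return f"{prefix}{pkg['version']}-{pkg['release']}"

print(evr(packages["openssl"][0]))  # -> 1:3.1.4-3.fc39
```

Note that `epoch` is only rendered when non-null, which matches how the log distinguishes, say, `openssl` (epoch 1) from `gcc` (no epoch).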
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.41.238 closed. 8454 1726882418.08602: done with _execute_module (package_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'package_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882416.7252226-9007-44193724747426/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 8454 1726882418.08620: _low_level_execute_command(): starting 8454 1726882418.08626: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882416.7252226-9007-44193724747426/ > /dev/null 2>&1 && sleep 0' 8454 1726882418.09069: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 8454 1726882418.09073: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found <<< 8454 1726882418.09075: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 
10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 8454 1726882418.09078: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found <<< 8454 1726882418.09080: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8454 1726882418.09132: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 8454 1726882418.09143: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 8454 1726882418.09260: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8454 1726882418.11266: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8454 1726882418.11315: stderr chunk (state=3): >>><<< 8454 1726882418.11318: stdout chunk (state=3): >>><<< 8454 1726882418.11331: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 8454 1726882418.11340: handler run complete 8454 1726882418.12112: variable 'ansible_facts' from source: unknown 8454 1726882418.12529: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8454 1726882418.14508: variable 'ansible_facts' from source: unknown 8454 1726882418.14918: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8454 1726882418.15682: attempt loop complete, returning result 8454 1726882418.15697: _execute() done 8454 1726882418.15700: dumping result to json 8454 1726882418.15883: done dumping result, returning 8454 1726882418.15890: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check which packages are installed [0affe814-3a2d-f59f-16b9-00000000018e] 8454 1726882418.15896: sending task result for task 0affe814-3a2d-f59f-16b9-00000000018e 8454 1726882418.17729: done sending task result for task 0affe814-3a2d-f59f-16b9-00000000018e 8454 1726882418.17733: WORKER PROCESS EXITING ok: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 8454 1726882418.17777: no more pending results, returning what we have 8454 1726882418.17782: results queue empty 8454 1726882418.17784: checking for any_errors_fatal 8454 1726882418.17789: done checking for any_errors_fatal 8454 1726882418.17790: checking for max_fail_percentage 8454 1726882418.17791: done checking for max_fail_percentage 8454 1726882418.17792: checking to see if all hosts have failed and the running result is not ok 8454 1726882418.17792: done checking to see if all 
hosts have failed 8454 1726882418.17793: getting the remaining hosts for this loop 8454 1726882418.17794: done getting the remaining hosts for this loop 8454 1726882418.17797: getting the next task for host managed_node3 8454 1726882418.17802: done getting next task for host managed_node3 8454 1726882418.17805: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider 8454 1726882418.17807: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8454 1726882418.17815: getting variables 8454 1726882418.17816: in VariableManager get_vars() 8454 1726882418.17846: Calling all_inventory to load vars for managed_node3 8454 1726882418.17849: Calling groups_inventory to load vars for managed_node3 8454 1726882418.17850: Calling all_plugins_inventory to load vars for managed_node3 8454 1726882418.17858: Calling all_plugins_play to load vars for managed_node3 8454 1726882418.17860: Calling groups_plugins_inventory to load vars for managed_node3 8454 1726882418.17862: Calling groups_plugins_play to load vars for managed_node3 8454 1726882418.19012: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8454 1726882418.20552: done with get_vars() 8454 1726882418.20572: done getting variables 8454 1726882418.20624: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print network provider] ************** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Friday 20 September 2024 21:33:38 -0400 (0:00:01.540) 0:00:16.224 ****** 8454 1726882418.20652: entering _queue_task() for managed_node3/debug 8454 1726882418.20874: worker is 1 (out of 1 available) 8454 1726882418.20888: exiting _queue_task() for managed_node3/debug 8454 1726882418.20904: done queuing things up, now waiting for results queue to drain 8454 1726882418.20906: waiting for pending results... 8454 1726882418.21082: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Print network provider 8454 1726882418.21188: in run() - task 0affe814-3a2d-f59f-16b9-000000000027 8454 1726882418.21202: variable 'ansible_search_path' from source: unknown 8454 1726882418.21206: variable 'ansible_search_path' from source: unknown 8454 1726882418.21244: calling self._execute() 8454 1726882418.21312: variable 'ansible_host' from source: host vars for 'managed_node3' 8454 1726882418.21318: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8454 1726882418.21330: variable 'omit' from source: magic vars 8454 1726882418.21647: variable 'ansible_distribution_major_version' from source: facts 8454 1726882418.21656: Evaluated conditional (ansible_distribution_major_version != '6'): True 8454 1726882418.21663: variable 'omit' from source: magic vars 8454 1726882418.21713: variable 'omit' from source: magic vars 8454 1726882418.21796: variable 'network_provider' from source: set_fact 8454 1726882418.21815: variable 'omit' from source: magic vars 8454 1726882418.21852: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 8454 1726882418.21884: 
Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 8454 1726882418.21903: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 8454 1726882418.21922: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8454 1726882418.21932: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8454 1726882418.21964: variable 'inventory_hostname' from source: host vars for 'managed_node3' 8454 1726882418.21968: variable 'ansible_host' from source: host vars for 'managed_node3' 8454 1726882418.21971: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8454 1726882418.22057: Set connection var ansible_connection to ssh 8454 1726882418.22066: Set connection var ansible_shell_executable to /bin/sh 8454 1726882418.22073: Set connection var ansible_timeout to 10 8454 1726882418.22076: Set connection var ansible_shell_type to sh 8454 1726882418.22086: Set connection var ansible_pipelining to False 8454 1726882418.22093: Set connection var ansible_module_compression to ZIP_DEFLATED 8454 1726882418.22115: variable 'ansible_shell_executable' from source: unknown 8454 1726882418.22119: variable 'ansible_connection' from source: unknown 8454 1726882418.22123: variable 'ansible_module_compression' from source: unknown 8454 1726882418.22125: variable 'ansible_shell_type' from source: unknown 8454 1726882418.22128: variable 'ansible_shell_executable' from source: unknown 8454 1726882418.22131: variable 'ansible_host' from source: host vars for 'managed_node3' 8454 1726882418.22139: variable 'ansible_pipelining' from source: unknown 8454 1726882418.22142: variable 'ansible_timeout' from source: unknown 8454 1726882418.22145: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed_node3' 8454 1726882418.22262: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 8454 1726882418.22272: variable 'omit' from source: magic vars 8454 1726882418.22277: starting attempt loop 8454 1726882418.22283: running the handler 8454 1726882418.22319: handler run complete 8454 1726882418.22336: attempt loop complete, returning result 8454 1726882418.22340: _execute() done 8454 1726882418.22343: dumping result to json 8454 1726882418.22347: done dumping result, returning 8454 1726882418.22356: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Print network provider [0affe814-3a2d-f59f-16b9-000000000027] 8454 1726882418.22363: sending task result for task 0affe814-3a2d-f59f-16b9-000000000027 8454 1726882418.22454: done sending task result for task 0affe814-3a2d-f59f-16b9-000000000027 8454 1726882418.22457: WORKER PROCESS EXITING ok: [managed_node3] => {} MSG: Using network provider: nm 8454 1726882418.22522: no more pending results, returning what we have 8454 1726882418.22525: results queue empty 8454 1726882418.22526: checking for any_errors_fatal 8454 1726882418.22533: done checking for any_errors_fatal 8454 1726882418.22535: checking for max_fail_percentage 8454 1726882418.22537: done checking for max_fail_percentage 8454 1726882418.22538: checking to see if all hosts have failed and the running result is not ok 8454 1726882418.22539: done checking to see if all hosts have failed 8454 1726882418.22540: getting the remaining hosts for this loop 8454 1726882418.22541: done getting the remaining hosts for this loop 8454 1726882418.22545: getting the next task for host managed_node3 8454 1726882418.22550: done getting next task for host managed_node3 8454 
1726882418.22554: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 8454 1726882418.22557: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8454 1726882418.22568: getting variables 8454 1726882418.22569: in VariableManager get_vars() 8454 1726882418.22607: Calling all_inventory to load vars for managed_node3 8454 1726882418.22610: Calling groups_inventory to load vars for managed_node3 8454 1726882418.22612: Calling all_plugins_inventory to load vars for managed_node3 8454 1726882418.22621: Calling all_plugins_play to load vars for managed_node3 8454 1726882418.22623: Calling groups_plugins_inventory to load vars for managed_node3 8454 1726882418.22626: Calling groups_plugins_play to load vars for managed_node3 8454 1726882418.23796: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8454 1726882418.25346: done with get_vars() 8454 1726882418.25369: done getting variables 8454 1726882418.25441: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration 
if using the `network_state` variable with the initscripts provider] *** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Friday 20 September 2024 21:33:38 -0400 (0:00:00.048) 0:00:16.272 ****** 8454 1726882418.25467: entering _queue_task() for managed_node3/fail 8454 1726882418.25469: Creating lock for fail 8454 1726882418.25676: worker is 1 (out of 1 available) 8454 1726882418.25693: exiting _queue_task() for managed_node3/fail 8454 1726882418.25707: done queuing things up, now waiting for results queue to drain 8454 1726882418.25709: waiting for pending results... 8454 1726882418.25874: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 8454 1726882418.25967: in run() - task 0affe814-3a2d-f59f-16b9-000000000028 8454 1726882418.25982: variable 'ansible_search_path' from source: unknown 8454 1726882418.25986: variable 'ansible_search_path' from source: unknown 8454 1726882418.26014: calling self._execute() 8454 1726882418.26085: variable 'ansible_host' from source: host vars for 'managed_node3' 8454 1726882418.26089: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8454 1726882418.26100: variable 'omit' from source: magic vars 8454 1726882418.26390: variable 'ansible_distribution_major_version' from source: facts 8454 1726882418.26402: Evaluated conditional (ansible_distribution_major_version != '6'): True 8454 1726882418.26504: variable 'network_state' from source: role '' defaults 8454 1726882418.26514: Evaluated conditional (network_state != {}): False 8454 1726882418.26517: when evaluation is False, skipping this task 8454 1726882418.26520: _execute() done 8454 1726882418.26525: dumping result to json 8454 1726882418.26529: done dumping result, returning 8454 1726882418.26538: done running TaskExecutor() for managed_node3/TASK: 
fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider [0affe814-3a2d-f59f-16b9-000000000028] 8454 1726882418.26544: sending task result for task 0affe814-3a2d-f59f-16b9-000000000028 8454 1726882418.26638: done sending task result for task 0affe814-3a2d-f59f-16b9-000000000028 8454 1726882418.26642: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 8454 1726882418.26693: no more pending results, returning what we have 8454 1726882418.26696: results queue empty 8454 1726882418.26697: checking for any_errors_fatal 8454 1726882418.26702: done checking for any_errors_fatal 8454 1726882418.26703: checking for max_fail_percentage 8454 1726882418.26705: done checking for max_fail_percentage 8454 1726882418.26706: checking to see if all hosts have failed and the running result is not ok 8454 1726882418.26706: done checking to see if all hosts have failed 8454 1726882418.26707: getting the remaining hosts for this loop 8454 1726882418.26709: done getting the remaining hosts for this loop 8454 1726882418.26712: getting the next task for host managed_node3 8454 1726882418.26718: done getting next task for host managed_node3 8454 1726882418.26721: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 8454 1726882418.26724: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8454 1726882418.26747: getting variables 8454 1726882418.26749: in VariableManager get_vars() 8454 1726882418.26781: Calling all_inventory to load vars for managed_node3 8454 1726882418.26783: Calling groups_inventory to load vars for managed_node3 8454 1726882418.26785: Calling all_plugins_inventory to load vars for managed_node3 8454 1726882418.26792: Calling all_plugins_play to load vars for managed_node3 8454 1726882418.26794: Calling groups_plugins_inventory to load vars for managed_node3 8454 1726882418.26797: Calling groups_plugins_play to load vars for managed_node3 8454 1726882418.28094: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8454 1726882418.29836: done with get_vars() 8454 1726882418.29867: done getting variables 8454 1726882418.29935: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8] *** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18 Friday 20 September 2024 21:33:38 -0400 (0:00:00.044) 0:00:16.317 ****** 8454 1726882418.29969: entering _queue_task() for managed_node3/fail 8454 1726882418.30214: worker is 1 (out of 1 available) 8454 1726882418.30229: exiting _queue_task() for managed_node3/fail 8454 1726882418.30345: done queuing things up, now waiting for results queue to drain 8454 1726882418.30348: waiting for pending results... 
8454 1726882418.30593: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8
8454 1726882418.30687: in run() - task 0affe814-3a2d-f59f-16b9-000000000029
8454 1726882418.30700: variable 'ansible_search_path' from source: unknown
8454 1726882418.30706: variable 'ansible_search_path' from source: unknown
8454 1726882418.30740: calling self._execute()
8454 1726882418.30807: variable 'ansible_host' from source: host vars for 'managed_node3'
8454 1726882418.30816: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
8454 1726882418.30830: variable 'omit' from source: magic vars
8454 1726882418.31155: variable 'ansible_distribution_major_version' from source: facts
8454 1726882418.31166: Evaluated conditional (ansible_distribution_major_version != '6'): True
8454 1726882418.31272: variable 'network_state' from source: role '' defaults
8454 1726882418.31280: Evaluated conditional (network_state != {}): False
8454 1726882418.31286: when evaluation is False, skipping this task
8454 1726882418.31290: _execute() done
8454 1726882418.31295: dumping result to json
8454 1726882418.31300: done dumping result, returning
8454 1726882418.31308: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 [0affe814-3a2d-f59f-16b9-000000000029]
8454 1726882418.31313: sending task result for task 0affe814-3a2d-f59f-16b9-000000000029
8454 1726882418.31405: done sending task result for task 0affe814-3a2d-f59f-16b9-000000000029
8454 1726882418.31408: WORKER PROCESS EXITING
skipping: [managed_node3] => {
    "changed": false,
    "false_condition": "network_state != {}",
    "skip_reason": "Conditional result was False"
}
8454 1726882418.31461: no more pending results, returning what we have
8454 1726882418.31465: results queue empty
8454 1726882418.31466: checking for any_errors_fatal
8454 1726882418.31471: done checking for any_errors_fatal
8454 1726882418.31472: checking for max_fail_percentage
8454 1726882418.31474: done checking for max_fail_percentage
8454 1726882418.31475: checking to see if all hosts have failed and the running result is not ok
8454 1726882418.31476: done checking to see if all hosts have failed
8454 1726882418.31476: getting the remaining hosts for this loop
8454 1726882418.31478: done getting the remaining hosts for this loop
8454 1726882418.31482: getting the next task for host managed_node3
8454 1726882418.31487: done getting next task for host managed_node3
8454 1726882418.31491: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later
8454 1726882418.31494: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
8454 1726882418.31508: getting variables
8454 1726882418.31509: in VariableManager get_vars()
8454 1726882418.31548: Calling all_inventory to load vars for managed_node3
8454 1726882418.31550: Calling groups_inventory to load vars for managed_node3
8454 1726882418.31552: Calling all_plugins_inventory to load vars for managed_node3
8454 1726882418.31559: Calling all_plugins_play to load vars for managed_node3
8454 1726882418.31561: Calling groups_plugins_inventory to load vars for managed_node3
8454 1726882418.31563: Calling groups_plugins_play to load vars for managed_node3
8454 1726882418.32945: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
8454 1726882418.35818: done with get_vars()
8454 1726882418.35852: done getting variables
8454 1726882418.35916: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)
TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later] ***
task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25
Friday 20 September 2024 21:33:38 -0400 (0:00:00.059) 0:00:16.377 ******
8454 1726882418.35955: entering _queue_task() for managed_node3/fail
8454 1726882418.36196: worker is 1 (out of 1 available)
8454 1726882418.36211: exiting _queue_task() for managed_node3/fail
8454 1726882418.36226: done queuing things up, now waiting for results queue to drain
8454 1726882418.36228: waiting for pending results...
8454 1726882418.36655: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later
8454 1726882418.36663: in run() - task 0affe814-3a2d-f59f-16b9-00000000002a
8454 1726882418.36668: variable 'ansible_search_path' from source: unknown
8454 1726882418.36671: variable 'ansible_search_path' from source: unknown
8454 1726882418.36717: calling self._execute()
8454 1726882418.36817: variable 'ansible_host' from source: host vars for 'managed_node3'
8454 1726882418.36830: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
8454 1726882418.36849: variable 'omit' from source: magic vars
8454 1726882418.37275: variable 'ansible_distribution_major_version' from source: facts
8454 1726882418.37302: Evaluated conditional (ansible_distribution_major_version != '6'): True
8454 1726882418.37533: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name
8454 1726882418.40268: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py
8454 1726882418.40359: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py
8454 1726882418.40521: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py
8454 1726882418.40525: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py
8454 1726882418.40528: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py
8454 1726882418.40615: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
8454 1726882418.40681: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
8454 1726882418.40721: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
8454 1726882418.40809: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
8454 1726882418.40854: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
8454 1726882418.40996: variable 'ansible_distribution_major_version' from source: facts
8454 1726882418.41069: Evaluated conditional (ansible_distribution_major_version | int > 9): True
8454 1726882418.41240: variable 'ansible_distribution' from source: facts
8454 1726882418.41251: variable '__network_rh_distros' from source: role '' defaults
8454 1726882418.41266: Evaluated conditional (ansible_distribution in __network_rh_distros): False
8454 1726882418.41276: when evaluation is False, skipping this task
8454 1726882418.41318: _execute() done
8454 1726882418.41330: dumping result to json
8454 1726882418.41335: done dumping result, returning
8454 1726882418.41424: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [0affe814-3a2d-f59f-16b9-00000000002a]
8454 1726882418.41427: sending task result for task 0affe814-3a2d-f59f-16b9-00000000002a
8454 1726882418.41505: done sending task result for task 0affe814-3a2d-f59f-16b9-00000000002a
8454 1726882418.41509: WORKER PROCESS EXITING
skipping: [managed_node3] => {
    "changed": false,
    "false_condition": "ansible_distribution in __network_rh_distros",
    "skip_reason": "Conditional result was False"
}
8454 1726882418.41566: no more pending results, returning what we have
8454 1726882418.41571: results queue empty
8454 1726882418.41572: checking for any_errors_fatal
8454 1726882418.41578: done checking for any_errors_fatal
8454 1726882418.41578: checking for max_fail_percentage
8454 1726882418.41581: done checking for max_fail_percentage
8454 1726882418.41582: checking to see if all hosts have failed and the running result is not ok
8454 1726882418.41583: done checking to see if all hosts have failed
8454 1726882418.41584: getting the remaining hosts for this loop
8454 1726882418.41587: done getting the remaining hosts for this loop
8454 1726882418.41592: getting the next task for host managed_node3
8454 1726882418.41599: done getting next task for host managed_node3
8454 1726882418.41604: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces
8454 1726882418.41608: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
8454 1726882418.41625: getting variables
8454 1726882418.41627: in VariableManager get_vars()
8454 1726882418.41680: Calling all_inventory to load vars for managed_node3
8454 1726882418.41683: Calling groups_inventory to load vars for managed_node3
8454 1726882418.41686: Calling all_plugins_inventory to load vars for managed_node3
8454 1726882418.41699: Calling all_plugins_play to load vars for managed_node3
8454 1726882418.41703: Calling groups_plugins_inventory to load vars for managed_node3
8454 1726882418.41706: Calling groups_plugins_play to load vars for managed_node3
8454 1726882418.43395: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
8454 1726882418.45439: done with get_vars()
8454 1726882418.45463: done getting variables
8454 1726882418.45561: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True)
TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces] ***
task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36
Friday 20 September 2024 21:33:38 -0400 (0:00:00.096) 0:00:16.473 ******
8454 1726882418.45587: entering _queue_task() for managed_node3/dnf
8454 1726882418.45801: worker is 1 (out of 1 available)
8454 1726882418.45815: exiting _queue_task() for managed_node3/dnf
8454 1726882418.45829: done queuing things up, now waiting for results queue to drain
8454 1726882418.45831: waiting for pending results...
8454 1726882418.46004: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces
8454 1726882418.46107: in run() - task 0affe814-3a2d-f59f-16b9-00000000002b
8454 1726882418.46120: variable 'ansible_search_path' from source: unknown
8454 1726882418.46123: variable 'ansible_search_path' from source: unknown
8454 1726882418.46161: calling self._execute()
8454 1726882418.46235: variable 'ansible_host' from source: host vars for 'managed_node3'
8454 1726882418.46241: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
8454 1726882418.46252: variable 'omit' from source: magic vars
8454 1726882418.46555: variable 'ansible_distribution_major_version' from source: facts
8454 1726882418.46566: Evaluated conditional (ansible_distribution_major_version != '6'): True
8454 1726882418.46739: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name
8454 1726882418.49299: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py
8454 1726882418.49357: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py
8454 1726882418.49401: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py
8454 1726882418.49431: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py
8454 1726882418.49454: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py
8454 1726882418.49524: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
8454 1726882418.49549: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
8454 1726882418.49571: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
8454 1726882418.49611: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
8454 1726882418.49624: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
8454 1726882418.49719: variable 'ansible_distribution' from source: facts
8454 1726882418.49723: variable 'ansible_distribution_major_version' from source: facts
8454 1726882418.49731: Evaluated conditional (ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7): True
8454 1726882418.49823: variable '__network_wireless_connections_defined' from source: role '' defaults
8454 1726882418.49943: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
8454 1726882418.49963: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
8454 1726882418.49986: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
8454 1726882418.50019: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
8454 1726882418.50036: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
8454 1726882418.50072: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
8454 1726882418.50094: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
8454 1726882418.50114: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
8454 1726882418.50151: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
8454 1726882418.50164: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
8454 1726882418.50200: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
8454 1726882418.50219: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
8454 1726882418.50244: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
8454 1726882418.50277: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
8454 1726882418.50292: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
8454 1726882418.50422: variable 'network_connections' from source: task vars
8454 1726882418.50432: variable 'controller_profile' from source: play vars
8454 1726882418.50491: variable 'controller_profile' from source: play vars
8454 1726882418.50500: variable 'controller_device' from source: play vars
8454 1726882418.50553: variable 'controller_device' from source: play vars
8454 1726882418.50563: variable 'port1_profile' from source: play vars
8454 1726882418.50618: variable 'port1_profile' from source: play vars
8454 1726882418.50625: variable 'dhcp_interface1' from source: play vars
8454 1726882418.50678: variable 'dhcp_interface1' from source: play vars
8454 1726882418.50691: variable 'controller_profile' from source: play vars
8454 1726882418.50739: variable 'controller_profile' from source: play vars
8454 1726882418.50746: variable 'port2_profile' from source: play vars
8454 1726882418.50801: variable 'port2_profile' from source: play vars
8454 1726882418.50808: variable 'dhcp_interface2' from source: play vars
8454 1726882418.50860: variable 'dhcp_interface2' from source: play vars
8454 1726882418.50867: variable 'controller_profile' from source: play vars
8454 1726882418.50948: variable 'controller_profile' from source: play vars
8454 1726882418.51008: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name
8454 1726882418.51188: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py
8454 1726882418.51438: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py
8454 1726882418.51442: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py
8454 1726882418.51445: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py
8454 1726882418.51447: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False)
8454 1726882418.51449: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False)
8454 1726882418.51451: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False)
8454 1726882418.51463: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False)
8454 1726882418.51545: variable '__network_team_connections_defined' from source: role '' defaults
8454 1726882418.51889: variable 'network_connections' from source: task vars
8454 1726882418.51903: variable 'controller_profile' from source: play vars
8454 1726882418.51985: variable 'controller_profile' from source: play vars
8454 1726882418.52001: variable 'controller_device' from source: play vars
8454 1726882418.52083: variable 'controller_device' from source: play vars
8454 1726882418.52100: variable 'port1_profile' from source: play vars
8454 1726882418.52167: variable 'port1_profile' from source: play vars
8454 1726882418.52175: variable 'dhcp_interface1' from source: play vars
8454 1726882418.52239: variable 'dhcp_interface1' from source: play vars
8454 1726882418.52247: variable 'controller_profile' from source: play vars
8454 1726882418.52308: variable 'controller_profile' from source: play vars
8454 1726882418.52316: variable 'port2_profile' from source: play vars
8454 1726882418.52440: variable 'port2_profile' from source: play vars
8454 1726882418.52444: variable 'dhcp_interface2' from source: play vars
8454 1726882418.52468: variable 'dhcp_interface2' from source: play vars
8454 1726882418.52481: variable 'controller_profile' from source: play vars
8454 1726882418.52561: variable 'controller_profile' from source: play vars
8454 1726882418.52603: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False
8454 1726882418.52612: when evaluation is False, skipping this task
8454 1726882418.52621: _execute() done
8454 1726882418.52630: dumping result to json
8454 1726882418.52643: done dumping result, returning
8454 1726882418.52657: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [0affe814-3a2d-f59f-16b9-00000000002b]
8454 1726882418.52673: sending task result for task 0affe814-3a2d-f59f-16b9-00000000002b
skipping: [managed_node3] => {
    "changed": false,
    "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined",
    "skip_reason": "Conditional result was False"
}
8454 1726882418.52895: no more pending results, returning what we have
8454 1726882418.52899: results queue empty
8454 1726882418.52900: checking for any_errors_fatal
8454 1726882418.52906: done checking for any_errors_fatal
8454 1726882418.52907: checking for max_fail_percentage
8454 1726882418.52909: done checking for max_fail_percentage
8454 1726882418.52911: checking to see if all hosts have failed and the running result is not ok
8454 1726882418.52912: done checking to see if all hosts have failed
8454 1726882418.52913: getting the remaining hosts for this loop
8454 1726882418.52915: done getting the remaining hosts for this loop
8454 1726882418.52920: getting the next task for host managed_node3
8454 1726882418.52928: done getting next task for host managed_node3
8454 1726882418.52933: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces
8454 1726882418.52939: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
8454 1726882418.52955: getting variables
8454 1726882418.52957: in VariableManager get_vars()
8454 1726882418.53004: Calling all_inventory to load vars for managed_node3
8454 1726882418.53007: Calling groups_inventory to load vars for managed_node3
8454 1726882418.53011: Calling all_plugins_inventory to load vars for managed_node3
8454 1726882418.53023: Calling all_plugins_play to load vars for managed_node3
8454 1726882418.53027: Calling groups_plugins_inventory to load vars for managed_node3
8454 1726882418.53031: Calling groups_plugins_play to load vars for managed_node3
8454 1726882418.53246: done sending task result for task 0affe814-3a2d-f59f-16b9-00000000002b
8454 1726882418.53250: WORKER PROCESS EXITING
8454 1726882418.54652: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
8454 1726882418.56673: done with get_vars()
8454 1726882418.56695: done getting variables
redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf
8454 1726882418.56756: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)
TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces] ***
task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48
Friday 20 September 2024 21:33:38 -0400 (0:00:00.111) 0:00:16.585 ******
8454 1726882418.56782: entering _queue_task() for managed_node3/yum
8454 1726882418.56783: Creating lock for yum
8454 1726882418.56997: worker is 1 (out of 1 available)
8454 1726882418.57011: exiting _queue_task() for managed_node3/yum
8454 1726882418.57025: done queuing things up, now waiting for results queue to drain
8454 1726882418.57027: waiting for pending results...
8454 1726882418.57209: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces
8454 1726882418.57310: in run() - task 0affe814-3a2d-f59f-16b9-00000000002c
8454 1726882418.57324: variable 'ansible_search_path' from source: unknown
8454 1726882418.57328: variable 'ansible_search_path' from source: unknown
8454 1726882418.57363: calling self._execute()
8454 1726882418.57431: variable 'ansible_host' from source: host vars for 'managed_node3'
8454 1726882418.57439: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
8454 1726882418.57450: variable 'omit' from source: magic vars
8454 1726882418.57761: variable 'ansible_distribution_major_version' from source: facts
8454 1726882418.57772: Evaluated conditional (ansible_distribution_major_version != '6'): True
8454 1726882418.57926: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name
8454 1726882418.59642: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py
8454 1726882418.59698: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py
8454 1726882418.59728: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py
8454 1726882418.59764: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py
8454 1726882418.59790: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py
8454 1726882418.59857: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
8454 1726882418.59885: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
8454 1726882418.59908: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
8454 1726882418.59942: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
8454 1726882418.59954: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
8454 1726882418.60033: variable 'ansible_distribution_major_version' from source: facts
8454 1726882418.60048: Evaluated conditional (ansible_distribution_major_version | int < 8): False
8454 1726882418.60051: when evaluation is False, skipping this task
8454 1726882418.60054: _execute() done
8454 1726882418.60060: dumping result to json
8454 1726882418.60064: done dumping result, returning
8454 1726882418.60072: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces [0affe814-3a2d-f59f-16b9-00000000002c]
8454 1726882418.60080: sending task result for task 0affe814-3a2d-f59f-16b9-00000000002c
8454 1726882418.60171: done sending task result for task 0affe814-3a2d-f59f-16b9-00000000002c
8454 1726882418.60174: WORKER PROCESS EXITING
skipping: [managed_node3] => {
    "changed": false,
    "false_condition": "ansible_distribution_major_version | int < 8",
    "skip_reason": "Conditional result was False"
}
8454 1726882418.60247: no more pending results, returning what we have
8454 1726882418.60251: results queue empty
8454 1726882418.60252: checking for any_errors_fatal
8454 1726882418.60257: done checking for any_errors_fatal
8454 1726882418.60258: checking for max_fail_percentage
8454 1726882418.60259: done checking for max_fail_percentage
8454 1726882418.60260: checking to see if all hosts have failed and the running result is not ok
8454 1726882418.60261: done checking to see if all hosts have failed
8454 1726882418.60262: getting the remaining hosts for this loop
8454 1726882418.60265: done getting the remaining hosts for this loop
8454 1726882418.60269: getting the next task for host managed_node3
8454 1726882418.60275: done getting next task for host managed_node3
8454 1726882418.60279: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces
8454 1726882418.60282: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
8454 1726882418.60297: getting variables
8454 1726882418.60298: in VariableManager get_vars()
8454 1726882418.60338: Calling all_inventory to load vars for managed_node3
8454 1726882418.60341: Calling groups_inventory to load vars for managed_node3
8454 1726882418.60344: Calling all_plugins_inventory to load vars for managed_node3
8454 1726882418.60359: Calling all_plugins_play to load vars for managed_node3
8454 1726882418.60361: Calling groups_plugins_inventory to load vars for managed_node3
8454 1726882418.60364: Calling groups_plugins_play to load vars for managed_node3
8454 1726882418.61597: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
8454 1726882418.63126: done with get_vars()
8454 1726882418.63147: done getting variables
8454 1726882418.63195: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)
TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces] ***
task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60
Friday 20 September 2024 21:33:38 -0400 (0:00:00.064) 0:00:16.649 ******
8454 1726882418.63220: entering _queue_task() for managed_node3/fail
8454 1726882418.63419: worker is 1 (out of 1 available)
8454 1726882418.63433: exiting _queue_task() for managed_node3/fail
8454 1726882418.63449: done queuing things up, now waiting for results queue to drain
8454 1726882418.63451: waiting for pending results...
8454 1726882418.63622: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 8454 1726882418.63718: in run() - task 0affe814-3a2d-f59f-16b9-00000000002d 8454 1726882418.63731: variable 'ansible_search_path' from source: unknown 8454 1726882418.63736: variable 'ansible_search_path' from source: unknown 8454 1726882418.63769: calling self._execute() 8454 1726882418.63839: variable 'ansible_host' from source: host vars for 'managed_node3' 8454 1726882418.63846: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8454 1726882418.63856: variable 'omit' from source: magic vars 8454 1726882418.64160: variable 'ansible_distribution_major_version' from source: facts 8454 1726882418.64171: Evaluated conditional (ansible_distribution_major_version != '6'): True 8454 1726882418.64272: variable '__network_wireless_connections_defined' from source: role '' defaults 8454 1726882418.64447: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 8454 1726882418.66137: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 8454 1726882418.66193: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 8454 1726882418.66225: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 8454 1726882418.66257: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 8454 1726882418.66282: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 8454 1726882418.66354: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, 
class_only=False) 8454 1726882418.66377: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 8454 1726882418.66406: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 8454 1726882418.66440: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 8454 1726882418.66453: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 8454 1726882418.66496: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 8454 1726882418.66518: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 8454 1726882418.66543: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 8454 1726882418.66574: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 8454 1726882418.66589: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 
(found_in_cache=True, class_only=False) 8454 1726882418.66626: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 8454 1726882418.66650: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 8454 1726882418.66670: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 8454 1726882418.66703: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 8454 1726882418.66715: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 8454 1726882418.66862: variable 'network_connections' from source: task vars 8454 1726882418.66873: variable 'controller_profile' from source: play vars 8454 1726882418.66930: variable 'controller_profile' from source: play vars 8454 1726882418.66941: variable 'controller_device' from source: play vars 8454 1726882418.66997: variable 'controller_device' from source: play vars 8454 1726882418.67007: variable 'port1_profile' from source: play vars 8454 1726882418.67062: variable 'port1_profile' from source: play vars 8454 1726882418.67067: variable 'dhcp_interface1' from source: play vars 8454 1726882418.67118: variable 'dhcp_interface1' from source: play vars 8454 1726882418.67124: variable 'controller_profile' from source: play vars 8454 1726882418.67178: variable 'controller_profile' from source: play vars 8454 
1726882418.67187: variable 'port2_profile' from source: play vars 8454 1726882418.67238: variable 'port2_profile' from source: play vars 8454 1726882418.67245: variable 'dhcp_interface2' from source: play vars 8454 1726882418.67300: variable 'dhcp_interface2' from source: play vars 8454 1726882418.67307: variable 'controller_profile' from source: play vars 8454 1726882418.67359: variable 'controller_profile' from source: play vars 8454 1726882418.67422: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 8454 1726882418.67568: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 8454 1726882418.67607: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 8454 1726882418.67631: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 8454 1726882418.67659: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 8454 1726882418.67698: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 8454 1726882418.67719: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 8454 1726882418.67744: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 8454 1726882418.67765: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 8454 1726882418.67825: variable '__network_team_connections_defined' from 
source: role '' defaults 8454 1726882418.68028: variable 'network_connections' from source: task vars 8454 1726882418.68032: variable 'controller_profile' from source: play vars 8454 1726882418.68091: variable 'controller_profile' from source: play vars 8454 1726882418.68098: variable 'controller_device' from source: play vars 8454 1726882418.68154: variable 'controller_device' from source: play vars 8454 1726882418.68159: variable 'port1_profile' from source: play vars 8454 1726882418.68211: variable 'port1_profile' from source: play vars 8454 1726882418.68218: variable 'dhcp_interface1' from source: play vars 8454 1726882418.68271: variable 'dhcp_interface1' from source: play vars 8454 1726882418.68278: variable 'controller_profile' from source: play vars 8454 1726882418.68328: variable 'controller_profile' from source: play vars 8454 1726882418.68337: variable 'port2_profile' from source: play vars 8454 1726882418.68391: variable 'port2_profile' from source: play vars 8454 1726882418.68398: variable 'dhcp_interface2' from source: play vars 8454 1726882418.68448: variable 'dhcp_interface2' from source: play vars 8454 1726882418.68455: variable 'controller_profile' from source: play vars 8454 1726882418.68509: variable 'controller_profile' from source: play vars 8454 1726882418.68539: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 8454 1726882418.68542: when evaluation is False, skipping this task 8454 1726882418.68545: _execute() done 8454 1726882418.68550: dumping result to json 8454 1726882418.68554: done dumping result, returning 8454 1726882418.68563: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces [0affe814-3a2d-f59f-16b9-00000000002d] 8454 1726882418.68569: sending task result for task 0affe814-3a2d-f59f-16b9-00000000002d 8454 1726882418.68663: done sending task result for task 
0affe814-3a2d-f59f-16b9-00000000002d 8454 1726882418.68666: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 8454 1726882418.68753: no more pending results, returning what we have 8454 1726882418.68757: results queue empty 8454 1726882418.68758: checking for any_errors_fatal 8454 1726882418.68764: done checking for any_errors_fatal 8454 1726882418.68765: checking for max_fail_percentage 8454 1726882418.68767: done checking for max_fail_percentage 8454 1726882418.68768: checking to see if all hosts have failed and the running result is not ok 8454 1726882418.68769: done checking to see if all hosts have failed 8454 1726882418.68770: getting the remaining hosts for this loop 8454 1726882418.68772: done getting the remaining hosts for this loop 8454 1726882418.68778: getting the next task for host managed_node3 8454 1726882418.68784: done getting next task for host managed_node3 8454 1726882418.68789: ^ task is: TASK: fedora.linux_system_roles.network : Install packages 8454 1726882418.68793: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8454 1726882418.68809: getting variables 8454 1726882418.68810: in VariableManager get_vars() 8454 1726882418.68851: Calling all_inventory to load vars for managed_node3 8454 1726882418.68854: Calling groups_inventory to load vars for managed_node3 8454 1726882418.68857: Calling all_plugins_inventory to load vars for managed_node3 8454 1726882418.68867: Calling all_plugins_play to load vars for managed_node3 8454 1726882418.68870: Calling groups_plugins_inventory to load vars for managed_node3 8454 1726882418.68874: Calling groups_plugins_play to load vars for managed_node3 8454 1726882418.70102: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8454 1726882418.71641: done with get_vars() 8454 1726882418.71663: done getting variables 8454 1726882418.71712: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install packages] ******************** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73 Friday 20 September 2024 21:33:38 -0400 (0:00:00.085) 0:00:16.735 ****** 8454 1726882418.71740: entering _queue_task() for managed_node3/package 8454 1726882418.71962: worker is 1 (out of 1 available) 8454 1726882418.71974: exiting _queue_task() for managed_node3/package 8454 1726882418.71989: done queuing things up, now waiting for results queue to drain 8454 1726882418.71990: waiting for pending results... 
8454 1726882418.72177: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install packages 8454 1726882418.72271: in run() - task 0affe814-3a2d-f59f-16b9-00000000002e 8454 1726882418.72286: variable 'ansible_search_path' from source: unknown 8454 1726882418.72290: variable 'ansible_search_path' from source: unknown 8454 1726882418.72323: calling self._execute() 8454 1726882418.72397: variable 'ansible_host' from source: host vars for 'managed_node3' 8454 1726882418.72403: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8454 1726882418.72414: variable 'omit' from source: magic vars 8454 1726882418.72720: variable 'ansible_distribution_major_version' from source: facts 8454 1726882418.72730: Evaluated conditional (ansible_distribution_major_version != '6'): True 8454 1726882418.72898: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 8454 1726882418.73110: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 8454 1726882418.73146: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 8454 1726882418.73174: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 8454 1726882418.73205: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 8454 1726882418.73293: variable 'network_packages' from source: role '' defaults 8454 1726882418.73385: variable '__network_provider_setup' from source: role '' defaults 8454 1726882418.73394: variable '__network_service_name_default_nm' from source: role '' defaults 8454 1726882418.73451: variable '__network_service_name_default_nm' from source: role '' defaults 8454 1726882418.73459: variable '__network_packages_default_nm' from source: role '' defaults 8454 1726882418.73511: variable '__network_packages_default_nm' from source: role 
'' defaults 8454 1726882418.73673: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 8454 1726882418.78673: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 8454 1726882418.78733: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 8454 1726882418.78761: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 8454 1726882418.78789: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 8454 1726882418.78811: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 8454 1726882418.78871: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 8454 1726882418.78895: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 8454 1726882418.78917: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 8454 1726882418.78955: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 8454 1726882418.78968: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 8454 1726882418.79006: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 8454 1726882418.79025: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 8454 1726882418.79049: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 8454 1726882418.79084: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 8454 1726882418.79095: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 8454 1726882418.79277: variable '__network_packages_default_gobject_packages' from source: role '' defaults 8454 1726882418.79371: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 8454 1726882418.79395: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 8454 1726882418.79415: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 8454 1726882418.79539: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 8454 1726882418.79543: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 8454 1726882418.79598: variable 'ansible_python' from source: facts 8454 1726882418.79629: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 8454 1726882418.79731: variable '__network_wpa_supplicant_required' from source: role '' defaults 8454 1726882418.79831: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 8454 1726882418.79994: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 8454 1726882418.80028: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 8454 1726882418.80065: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 8454 1726882418.80112: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 8454 1726882418.80130: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 8454 1726882418.80191: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 8454 1726882418.80339: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 8454 1726882418.80342: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 8454 1726882418.80344: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 8454 1726882418.80346: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 8454 1726882418.80495: variable 'network_connections' from source: task vars 8454 1726882418.80507: variable 'controller_profile' from source: play vars 8454 1726882418.80618: variable 'controller_profile' from source: play vars 8454 1726882418.80633: variable 'controller_device' from source: play vars 8454 1726882418.80754: variable 'controller_device' from source: play vars 8454 1726882418.80772: variable 'port1_profile' from source: play vars 8454 1726882418.80882: variable 'port1_profile' from source: play vars 8454 1726882418.80899: variable 'dhcp_interface1' from source: play vars 8454 1726882418.81016: variable 'dhcp_interface1' from source: play vars 8454 1726882418.81031: variable 'controller_profile' from source: play vars 8454 1726882418.81151: variable 'controller_profile' from source: play vars 8454 1726882418.81167: variable 'port2_profile' from source: play vars 8454 1726882418.81283: variable 'port2_profile' from source: play vars 8454 1726882418.81301: 
variable 'dhcp_interface2' from source: play vars 8454 1726882418.81416: variable 'dhcp_interface2' from source: play vars 8454 1726882418.81431: variable 'controller_profile' from source: play vars 8454 1726882418.81548: variable 'controller_profile' from source: play vars 8454 1726882418.81625: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 8454 1726882418.81739: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 8454 1726882418.81744: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 8454 1726882418.81747: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 8454 1726882418.81793: variable '__network_wireless_connections_defined' from source: role '' defaults 8454 1726882418.82173: variable 'network_connections' from source: task vars 8454 1726882418.82186: variable 'controller_profile' from source: play vars 8454 1726882418.82303: variable 'controller_profile' from source: play vars 8454 1726882418.82321: variable 'controller_device' from source: play vars 8454 1726882418.82441: variable 'controller_device' from source: play vars 8454 1726882418.82459: variable 'port1_profile' from source: play vars 8454 1726882418.82576: variable 'port1_profile' from source: play vars 8454 1726882418.82595: variable 'dhcp_interface1' from source: play vars 8454 1726882418.82715: variable 'dhcp_interface1' from source: play vars 8454 1726882418.82732: variable 'controller_profile' from source: play vars 8454 
1726882418.82943: variable 'controller_profile' from source: play vars 8454 1726882418.82946: variable 'port2_profile' from source: play vars 8454 1726882418.82991: variable 'port2_profile' from source: play vars 8454 1726882418.83006: variable 'dhcp_interface2' from source: play vars 8454 1726882418.83123: variable 'dhcp_interface2' from source: play vars 8454 1726882418.83145: variable 'controller_profile' from source: play vars 8454 1726882418.83270: variable 'controller_profile' from source: play vars 8454 1726882418.83343: variable '__network_packages_default_wireless' from source: role '' defaults 8454 1726882418.83453: variable '__network_wireless_connections_defined' from source: role '' defaults 8454 1726882418.83875: variable 'network_connections' from source: task vars 8454 1726882418.83887: variable 'controller_profile' from source: play vars 8454 1726882418.83977: variable 'controller_profile' from source: play vars 8454 1726882418.83992: variable 'controller_device' from source: play vars 8454 1726882418.84080: variable 'controller_device' from source: play vars 8454 1726882418.84097: variable 'port1_profile' from source: play vars 8454 1726882418.84339: variable 'port1_profile' from source: play vars 8454 1726882418.84342: variable 'dhcp_interface1' from source: play vars 8454 1726882418.84344: variable 'dhcp_interface1' from source: play vars 8454 1726882418.84346: variable 'controller_profile' from source: play vars 8454 1726882418.84348: variable 'controller_profile' from source: play vars 8454 1726882418.84351: variable 'port2_profile' from source: play vars 8454 1726882418.84425: variable 'port2_profile' from source: play vars 8454 1726882418.84442: variable 'dhcp_interface2' from source: play vars 8454 1726882418.84517: variable 'dhcp_interface2' from source: play vars 8454 1726882418.84530: variable 'controller_profile' from source: play vars 8454 1726882418.84604: variable 'controller_profile' from source: play vars 8454 1726882418.84640: 
variable '__network_packages_default_team' from source: role '' defaults
8454 1726882418.84732: variable '__network_team_connections_defined' from source: role '' defaults
8454 1726882418.85118: variable 'network_connections' from source: task vars
8454 1726882418.85129: variable 'controller_profile' from source: play vars
8454 1726882418.85206: variable 'controller_profile' from source: play vars
8454 1726882418.85220: variable 'controller_device' from source: play vars
8454 1726882418.85297: variable 'controller_device' from source: play vars
8454 1726882418.85313: variable 'port1_profile' from source: play vars
8454 1726882418.85390: variable 'port1_profile' from source: play vars
8454 1726882418.85403: variable 'dhcp_interface1' from source: play vars
8454 1726882418.85479: variable 'dhcp_interface1' from source: play vars
8454 1726882418.85494: variable 'controller_profile' from source: play vars
8454 1726882418.85570: variable 'controller_profile' from source: play vars
8454 1726882418.85584: variable 'port2_profile' from source: play vars
8454 1726882418.85659: variable 'port2_profile' from source: play vars
8454 1726882418.85673: variable 'dhcp_interface2' from source: play vars
8454 1726882418.85754: variable 'dhcp_interface2' from source: play vars
8454 1726882418.85766: variable 'controller_profile' from source: play vars
8454 1726882418.85839: variable 'controller_profile' from source: play vars
8454 1726882418.85920: variable '__network_service_name_default_initscripts' from source: role '' defaults
8454 1726882418.85992: variable '__network_service_name_default_initscripts' from source: role '' defaults
8454 1726882418.86005: variable '__network_packages_default_initscripts' from source: role '' defaults
8454 1726882418.86075: variable '__network_packages_default_initscripts' from source: role '' defaults
8454 1726882418.86439: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults
8454 1726882418.87022: variable 'network_connections' from source: task vars
8454 1726882418.87037: variable 'controller_profile' from source: play vars
8454 1726882418.87115: variable 'controller_profile' from source: play vars
8454 1726882418.87137: variable 'controller_device' from source: play vars
8454 1726882418.87216: variable 'controller_device' from source: play vars
8454 1726882418.87235: variable 'port1_profile' from source: play vars
8454 1726882418.87440: variable 'port1_profile' from source: play vars
8454 1726882418.87444: variable 'dhcp_interface1' from source: play vars
8454 1726882418.87448: variable 'dhcp_interface1' from source: play vars
8454 1726882418.87451: variable 'controller_profile' from source: play vars
8454 1726882418.87493: variable 'controller_profile' from source: play vars
8454 1726882418.87508: variable 'port2_profile' from source: play vars
8454 1726882418.87588: variable 'port2_profile' from source: play vars
8454 1726882418.87603: variable 'dhcp_interface2' from source: play vars
8454 1726882418.87684: variable 'dhcp_interface2' from source: play vars
8454 1726882418.87699: variable 'controller_profile' from source: play vars
8454 1726882418.87776: variable 'controller_profile' from source: play vars
8454 1726882418.87791: variable 'ansible_distribution' from source: facts
8454 1726882418.87800: variable '__network_rh_distros' from source: role '' defaults
8454 1726882418.87811: variable 'ansible_distribution_major_version' from source: facts
8454 1726882418.87846: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults
8454 1726882418.88061: variable 'ansible_distribution' from source: facts
8454 1726882418.88071: variable '__network_rh_distros' from source: role '' defaults
8454 1726882418.88082: variable 'ansible_distribution_major_version' from source: facts
8454 1726882418.88095: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults
8454 1726882418.88310: variable 'ansible_distribution' from source: facts
8454 1726882418.88321: variable '__network_rh_distros' from source: role '' defaults
8454 1726882418.88336: variable 'ansible_distribution_major_version' from source: facts
8454 1726882418.88381: variable 'network_provider' from source: set_fact
8454 1726882418.88404: variable 'ansible_facts' from source: unknown
8454 1726882418.89411: Evaluated conditional (not network_packages is subset(ansible_facts.packages.keys())): False
8454 1726882418.89420: when evaluation is False, skipping this task
8454 1726882418.89426: _execute() done
8454 1726882418.89744: dumping result to json
8454 1726882418.89747: done dumping result, returning
8454 1726882418.89750: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install packages [0affe814-3a2d-f59f-16b9-00000000002e]
8454 1726882418.89752: sending task result for task 0affe814-3a2d-f59f-16b9-00000000002e
8454 1726882418.89826: done sending task result for task 0affe814-3a2d-f59f-16b9-00000000002e
8454 1726882418.89829: WORKER PROCESS EXITING
skipping: [managed_node3] => {
    "changed": false,
    "false_condition": "not network_packages is subset(ansible_facts.packages.keys())",
    "skip_reason": "Conditional result was False"
}
8454 1726882418.89903: no more pending results, returning what we have
8454 1726882418.89907: results queue empty
8454 1726882418.89908: checking for any_errors_fatal
8454 1726882418.89914: done checking for any_errors_fatal
8454 1726882418.89916: checking for max_fail_percentage
8454 1726882418.89918: done checking for max_fail_percentage
8454 1726882418.89919: checking to see if all hosts have failed and the running result is not ok
8454 1726882418.89920: done checking to see if all hosts have failed
8454 1726882418.89921: getting the remaining hosts for this loop
8454 1726882418.89923: done getting the remaining hosts for this loop
8454 1726882418.89928: getting the next task for host managed_node3
8454 1726882418.90040: done getting next task for host managed_node3
8454 1726882418.90046: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable
8454 1726882418.90050: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
8454 1726882418.90067: getting variables
8454 1726882418.90068: in VariableManager get_vars()
8454 1726882418.90116: Calling all_inventory to load vars for managed_node3
8454 1726882418.90119: Calling groups_inventory to load vars for managed_node3
8454 1726882418.90122: Calling all_plugins_inventory to load vars for managed_node3
8454 1726882418.90132: Calling all_plugins_play to load vars for managed_node3
8454 1726882418.90257: Calling groups_plugins_inventory to load vars for managed_node3
8454 1726882418.90265: Calling groups_plugins_play to load vars for managed_node3
8454 1726882418.99529: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
8454 1726882419.02372: done with get_vars()
8454 1726882419.02417: done getting variables
8454 1726882419.02479: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] ***
task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85
Friday 20 September 2024 21:33:39 -0400 (0:00:00.307) 0:00:17.042 ******
8454 1726882419.02516: entering _queue_task() for managed_node3/package
8454 1726882419.03061: worker is 1 (out of 1 available)
8454 1726882419.03073: exiting _queue_task() for managed_node3/package
8454 1726882419.03086: done queuing things up, now waiting for results queue to drain
8454 1726882419.03088: waiting for pending results...
8454 1726882419.03205: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable
8454 1726882419.03382: in run() - task 0affe814-3a2d-f59f-16b9-00000000002f
8454 1726882419.03404: variable 'ansible_search_path' from source: unknown
8454 1726882419.03432: variable 'ansible_search_path' from source: unknown
8454 1726882419.03469: calling self._execute()
8454 1726882419.03653: variable 'ansible_host' from source: host vars for 'managed_node3'
8454 1726882419.03656: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
8454 1726882419.03659: variable 'omit' from source: magic vars
8454 1726882419.04070: variable 'ansible_distribution_major_version' from source: facts
8454 1726882419.04094: Evaluated conditional (ansible_distribution_major_version != '6'): True
8454 1726882419.04256: variable 'network_state' from source: role '' defaults
8454 1726882419.04274: Evaluated conditional (network_state != {}): False
8454 1726882419.04282: when evaluation is False, skipping this task
8454 1726882419.04290: _execute() done
8454 1726882419.04301: dumping result to json
8454 1726882419.04312: done dumping result, returning
8454 1726882419.04326: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [0affe814-3a2d-f59f-16b9-00000000002f]
8454 1726882419.04340: sending task result for task 0affe814-3a2d-f59f-16b9-00000000002f
8454 1726882419.04584: done sending task result for task 0affe814-3a2d-f59f-16b9-00000000002f
8454 1726882419.04588: WORKER PROCESS EXITING
skipping: [managed_node3] => {
    "changed": false,
    "false_condition": "network_state != {}",
    "skip_reason": "Conditional result was False"
}
8454 1726882419.04647: no more pending results, returning what we have
8454 1726882419.04651: results queue empty
8454 1726882419.04653: checking for any_errors_fatal
8454 1726882419.04663: done checking for any_errors_fatal
8454 1726882419.04664: checking for max_fail_percentage
8454 1726882419.04666: done checking for max_fail_percentage
8454 1726882419.04667: checking to see if all hosts have failed and the running result is not ok
8454 1726882419.04668: done checking to see if all hosts have failed
8454 1726882419.04669: getting the remaining hosts for this loop
8454 1726882419.04672: done getting the remaining hosts for this loop
8454 1726882419.04676: getting the next task for host managed_node3
8454 1726882419.04685: done getting next task for host managed_node3
8454 1726882419.04689: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable
8454 1726882419.04693: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
8454 1726882419.04713: getting variables
8454 1726882419.04716: in VariableManager get_vars()
8454 1726882419.04766: Calling all_inventory to load vars for managed_node3
8454 1726882419.04770: Calling groups_inventory to load vars for managed_node3
8454 1726882419.04773: Calling all_plugins_inventory to load vars for managed_node3
8454 1726882419.04787: Calling all_plugins_play to load vars for managed_node3
8454 1726882419.04791: Calling groups_plugins_inventory to load vars for managed_node3
8454 1726882419.04795: Calling groups_plugins_play to load vars for managed_node3
8454 1726882419.07253: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
8454 1726882419.10332: done with get_vars()
8454 1726882419.10382: done getting variables
8454 1726882419.10489: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] ***
task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96
Friday 20 September 2024 21:33:39 -0400 (0:00:00.080) 0:00:17.123 ******
8454 1726882419.10546: entering _queue_task() for managed_node3/package
8454 1726882419.11062: worker is 1 (out of 1 available)
8454 1726882419.11086: exiting _queue_task() for managed_node3/package
8454 1726882419.11102: done queuing things up, now waiting for results queue to drain
8454 1726882419.11103: waiting for pending results...
8454 1726882419.11455: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable
8454 1726882419.11517: in run() - task 0affe814-3a2d-f59f-16b9-000000000030
8454 1726882419.11545: variable 'ansible_search_path' from source: unknown
8454 1726882419.11556: variable 'ansible_search_path' from source: unknown
8454 1726882419.11610: calling self._execute()
8454 1726882419.11722: variable 'ansible_host' from source: host vars for 'managed_node3'
8454 1726882419.11738: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
8454 1726882419.11755: variable 'omit' from source: magic vars
8454 1726882419.12210: variable 'ansible_distribution_major_version' from source: facts
8454 1726882419.12237: Evaluated conditional (ansible_distribution_major_version != '6'): True
8454 1726882419.12396: variable 'network_state' from source: role '' defaults
8454 1726882419.12416: Evaluated conditional (network_state != {}): False
8454 1726882419.12426: when evaluation is False, skipping this task
8454 1726882419.12438: _execute() done
8454 1726882419.12451: dumping result to json
8454 1726882419.12461: done dumping result, returning
8454 1726882419.12540: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [0affe814-3a2d-f59f-16b9-000000000030]
8454 1726882419.12544: sending task result for task 0affe814-3a2d-f59f-16b9-000000000030
8454 1726882419.12629: done sending task result for task 0affe814-3a2d-f59f-16b9-000000000030
8454 1726882419.12633: WORKER PROCESS EXITING
skipping: [managed_node3] => {
    "changed": false,
    "false_condition": "network_state != {}",
    "skip_reason": "Conditional result was False"
}
8454 1726882419.12694: no more pending results, returning what we have
8454 1726882419.12699: results queue empty
8454 1726882419.12700: checking for any_errors_fatal
8454 1726882419.12710: done checking for any_errors_fatal
8454 1726882419.12711: checking for max_fail_percentage
8454 1726882419.12714: done checking for max_fail_percentage
8454 1726882419.12715: checking to see if all hosts have failed and the running result is not ok
8454 1726882419.12716: done checking to see if all hosts have failed
8454 1726882419.12717: getting the remaining hosts for this loop
8454 1726882419.12719: done getting the remaining hosts for this loop
8454 1726882419.12724: getting the next task for host managed_node3
8454 1726882419.12732: done getting next task for host managed_node3
8454 1726882419.12738: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces
8454 1726882419.12743: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
8454 1726882419.12954: getting variables
8454 1726882419.12955: in VariableManager get_vars()
8454 1726882419.12997: Calling all_inventory to load vars for managed_node3
8454 1726882419.13001: Calling groups_inventory to load vars for managed_node3
8454 1726882419.13004: Calling all_plugins_inventory to load vars for managed_node3
8454 1726882419.13014: Calling all_plugins_play to load vars for managed_node3
8454 1726882419.13017: Calling groups_plugins_inventory to load vars for managed_node3
8454 1726882419.13021: Calling groups_plugins_play to load vars for managed_node3
8454 1726882419.15174: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
8454 1726882419.18002: done with get_vars()
8454 1726882419.18046: done getting variables
8454 1726882419.18168: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True)

TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] ***
task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109
Friday 20 September 2024 21:33:39 -0400 (0:00:00.076) 0:00:17.199 ******
8454 1726882419.18207: entering _queue_task() for managed_node3/service
8454 1726882419.18210: Creating lock for service
8454 1726882419.18669: worker is 1 (out of 1 available)
8454 1726882419.18681: exiting _queue_task() for managed_node3/service
8454 1726882419.18696: done queuing things up, now waiting for results queue to drain
8454 1726882419.18697: waiting for pending results...
8454 1726882419.19056: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces
8454 1726882419.19061: in run() - task 0affe814-3a2d-f59f-16b9-000000000031
8454 1726882419.19080: variable 'ansible_search_path' from source: unknown
8454 1726882419.19089: variable 'ansible_search_path' from source: unknown
8454 1726882419.19138: calling self._execute()
8454 1726882419.19241: variable 'ansible_host' from source: host vars for 'managed_node3'
8454 1726882419.19262: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
8454 1726882419.19280: variable 'omit' from source: magic vars
8454 1726882419.19739: variable 'ansible_distribution_major_version' from source: facts
8454 1726882419.19759: Evaluated conditional (ansible_distribution_major_version != '6'): True
8454 1726882419.19920: variable '__network_wireless_connections_defined' from source: role '' defaults
8454 1726882419.20237: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name
8454 1726882419.23166: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py
8454 1726882419.23246: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py
8454 1726882419.23313: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py
8454 1726882419.23367: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py
8454 1726882419.23410: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py
8454 1726882419.23516: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
8454 1726882419.23641: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
8454 1726882419.23646: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
8454 1726882419.23664: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
8454 1726882419.23689: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
8454 1726882419.23759: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
8454 1726882419.23798: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
8454 1726882419.23837: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
8454 1726882419.23898: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
8454 1726882419.23921: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
8454 1726882419.23986: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
8454 1726882419.24094: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
8454 1726882419.24098: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
8454 1726882419.24116: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
8454 1726882419.24140: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
8454 1726882419.24374: variable 'network_connections' from source: task vars
8454 1726882419.24395: variable 'controller_profile' from source: play vars
8454 1726882419.24492: variable 'controller_profile' from source: play vars
8454 1726882419.24509: variable 'controller_device' from source: play vars
8454 1726882419.24594: variable 'controller_device' from source: play vars
8454 1726882419.24613: variable 'port1_profile' from source: play vars
8454 1726882419.24695: variable 'port1_profile' from source: play vars
8454 1726882419.24709: variable 'dhcp_interface1' from source: play vars
8454 1726882419.24791: variable 'dhcp_interface1' from source: play vars
8454 1726882419.24861: variable 'controller_profile' from source: play vars
8454 1726882419.24890: variable 'controller_profile' from source: play vars
8454 1726882419.24905: variable 'port2_profile' from source: play vars
8454 1726882419.24989: variable 'port2_profile' from source: play vars
8454 1726882419.25003: variable 'dhcp_interface2' from source: play vars
8454 1726882419.25087: variable 'dhcp_interface2' from source: play vars
8454 1726882419.25101: variable 'controller_profile' from source: play vars
8454 1726882419.25178: variable 'controller_profile' from source: play vars
8454 1726882419.25296: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name
8454 1726882419.25487: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py
8454 1726882419.25621: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py
8454 1726882419.25625: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py
8454 1726882419.25640: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py
8454 1726882419.25695: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False)
8454 1726882419.25726: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False)
8454 1726882419.25767: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False)
8454 1726882419.25805: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False)
8454 1726882419.25893: variable '__network_team_connections_defined' from source: role '' defaults
8454 1726882419.26219: variable 'network_connections' from source: task vars
8454 1726882419.26439: variable 'controller_profile' from source: play vars
8454 1726882419.26443: variable 'controller_profile' from source: play vars
8454 1726882419.26445: variable 'controller_device' from source: play vars
8454 1726882419.26446: variable 'controller_device' from source: play vars
8454 1726882419.26449: variable 'port1_profile' from source: play vars
8454 1726882419.26478: variable 'port1_profile' from source: play vars
8454 1726882419.26494: variable 'dhcp_interface1' from source: play vars
8454 1726882419.26575: variable 'dhcp_interface1' from source: play vars
8454 1726882419.26587: variable 'controller_profile' from source: play vars
8454 1726882419.26658: variable 'controller_profile' from source: play vars
8454 1726882419.26677: variable 'port2_profile' from source: play vars
8454 1726882419.26753: variable 'port2_profile' from source: play vars
8454 1726882419.26766: variable 'dhcp_interface2' from source: play vars
8454 1726882419.26847: variable 'dhcp_interface2' from source: play vars
8454 1726882419.26859: variable 'controller_profile' from source: play vars
8454 1726882419.26941: variable 'controller_profile' from source: play vars
8454 1726882419.26984: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False
8454 1726882419.26998: when evaluation is False, skipping this task
8454 1726882419.27007: _execute() done
8454 1726882419.27014: dumping result to json
8454 1726882419.27022: done dumping result, returning
8454 1726882419.27036: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces [0affe814-3a2d-f59f-16b9-000000000031]
8454 1726882419.27048: sending task result for task 0affe814-3a2d-f59f-16b9-000000000031
8454 1726882419.27342: done sending task result for task 0affe814-3a2d-f59f-16b9-000000000031
8454 1726882419.27346: WORKER PROCESS EXITING
skipping: [managed_node3] => {
    "changed": false,
    "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined",
    "skip_reason": "Conditional result was False"
}
8454 1726882419.27398: no more pending results, returning what we have
8454 1726882419.27402: results queue empty
8454 1726882419.27403: checking for any_errors_fatal
8454 1726882419.27411: done checking for any_errors_fatal
8454 1726882419.27412: checking for max_fail_percentage
8454 1726882419.27414: done checking for max_fail_percentage
8454 1726882419.27415: checking to see if all hosts have failed and the running result is not ok
8454 1726882419.27416: done checking to see if all hosts have failed
8454 1726882419.27417: getting the remaining hosts for this loop
8454 1726882419.27419: done getting the remaining hosts for this loop
8454 1726882419.27423: getting the next task for host managed_node3
8454 1726882419.27430: done getting next task for host managed_node3
8454 1726882419.27436: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager
8454 1726882419.27440: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
8454 1726882419.27457: getting variables
8454 1726882419.27459: in VariableManager get_vars()
8454 1726882419.27507: Calling all_inventory to load vars for managed_node3
8454 1726882419.27510: Calling groups_inventory to load vars for managed_node3
8454 1726882419.27513: Calling all_plugins_inventory to load vars for managed_node3
8454 1726882419.27525: Calling all_plugins_play to load vars for managed_node3
8454 1726882419.27529: Calling groups_plugins_inventory to load vars for managed_node3
8454 1726882419.27533: Calling groups_plugins_play to load vars for managed_node3
8454 1726882419.29951: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
8454 1726882419.32863: done with get_vars()
8454 1726882419.32908: done getting variables
8454 1726882419.32997: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] *****
task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122
Friday 20 September 2024 21:33:39 -0400 (0:00:00.148) 0:00:17.348 ******
8454 1726882419.33075: entering _queue_task() for managed_node3/service
8454 1726882419.33499: worker is 1 (out of 1 available)
8454 1726882419.33518: exiting _queue_task() for managed_node3/service
8454 1726882419.33537: done queuing things up, now waiting for results queue to drain
8454 1726882419.33539: waiting for pending results...
8454 1726882419.33827: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager
8454 1726882419.33986: in run() - task 0affe814-3a2d-f59f-16b9-000000000032
8454 1726882419.34011: variable 'ansible_search_path' from source: unknown
8454 1726882419.34022: variable 'ansible_search_path' from source: unknown
8454 1726882419.34140: calling self._execute()
8454 1726882419.34181: variable 'ansible_host' from source: host vars for 'managed_node3'
8454 1726882419.34196: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
8454 1726882419.34213: variable 'omit' from source: magic vars
8454 1726882419.34648: variable 'ansible_distribution_major_version' from source: facts
8454 1726882419.34668: Evaluated conditional (ansible_distribution_major_version != '6'): True
8454 1726882419.34875: variable 'network_provider' from source: set_fact
8454 1726882419.34889: variable 'network_state' from source: role '' defaults
8454 1726882419.34908: Evaluated conditional (network_provider == "nm" or network_state != {}): True
8454 1726882419.34921: variable 'omit' from source: magic vars
8454 1726882419.34998: variable 'omit' from source: magic vars
8454 1726882419.35038: variable 'network_service_name' from source: role '' defaults
8454 1726882419.35338: variable 'network_service_name' from source: role '' defaults
8454 1726882419.35341: variable '__network_provider_setup' from source: role '' defaults
8454 1726882419.35344: variable '__network_service_name_default_nm' from source: role '' defaults
8454 1726882419.35347: variable '__network_service_name_default_nm' from source: role '' defaults
8454 1726882419.35349: variable '__network_packages_default_nm' from source: role '' defaults
8454 1726882419.35418: variable '__network_packages_default_nm' from source: role '' defaults
8454 1726882419.35723: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name
8454 1726882419.38178: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py
8454 1726882419.38275: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py
8454 1726882419.38323: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py
8454 1726882419.38372: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py
8454 1726882419.38408: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py
8454 1726882419.38503: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
8454 1726882419.38545: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
8454 1726882419.38581: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
8454 1726882419.38640: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
8454 1726882419.38661: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
8454 1726882419.38721: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
8454 1726882419.38756: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
8454 1726882419.38791: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
8454 1726882419.38845: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
8454 1726882419.38867: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
8454 1726882419.39170: variable '__network_packages_default_gobject_packages' from source: role '' defaults
8454 1726882419.39321: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
8454 1726882419.39358: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
8454 1726882419.39393: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
8454 1726882419.39449: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
8454 1726882419.39470: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
8454 1726882419.39583: variable 'ansible_python' from source: facts
8454 1726882419.39612: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults
8454 1726882419.39716: variable '__network_wpa_supplicant_required' from source: role '' defaults
8454 1726882419.39817: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults
8454 1726882419.39981: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
8454 1726882419.40016: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
8454 1726882419.40053: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
8454 1726882419.40107: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
8454 1726882419.40130: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
8454 1726882419.40194: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
8454 1726882419.40239: Loading FilterModule
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 8454 1726882419.40274: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 8454 1726882419.40327: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 8454 1726882419.40353: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 8454 1726882419.40640: variable 'network_connections' from source: task vars 8454 1726882419.40644: variable 'controller_profile' from source: play vars 8454 1726882419.40647: variable 'controller_profile' from source: play vars 8454 1726882419.40649: variable 'controller_device' from source: play vars 8454 1726882419.40737: variable 'controller_device' from source: play vars 8454 1726882419.40758: variable 'port1_profile' from source: play vars 8454 1726882419.40850: variable 'port1_profile' from source: play vars 8454 1726882419.40868: variable 'dhcp_interface1' from source: play vars 8454 1726882419.40960: variable 'dhcp_interface1' from source: play vars 8454 1726882419.40976: variable 'controller_profile' from source: play vars 8454 1726882419.41068: variable 'controller_profile' from source: play vars 8454 1726882419.41086: variable 'port2_profile' from source: play vars 8454 1726882419.41178: variable 'port2_profile' from source: play vars 8454 1726882419.41197: variable 'dhcp_interface2' from source: play vars 8454 1726882419.41292: variable 'dhcp_interface2' from source: play vars 8454 1726882419.41314: variable 
'controller_profile' from source: play vars 8454 1726882419.41404: variable 'controller_profile' from source: play vars 8454 1726882419.41531: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 8454 1726882419.41746: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 8454 1726882419.41941: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 8454 1726882419.41945: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 8454 1726882419.41947: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 8454 1726882419.41999: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 8454 1726882419.42043: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 8454 1726882419.42090: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 8454 1726882419.42141: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 8454 1726882419.42205: variable '__network_wireless_connections_defined' from source: role '' defaults 8454 1726882419.42581: variable 'network_connections' from source: task vars 8454 1726882419.42596: variable 'controller_profile' from source: play vars 8454 1726882419.42687: variable 'controller_profile' from source: play vars 8454 1726882419.42707: variable 'controller_device' from 
source: play vars 8454 1726882419.42800: variable 'controller_device' from source: play vars 8454 1726882419.42821: variable 'port1_profile' from source: play vars 8454 1726882419.42913: variable 'port1_profile' from source: play vars 8454 1726882419.42932: variable 'dhcp_interface1' from source: play vars 8454 1726882419.43025: variable 'dhcp_interface1' from source: play vars 8454 1726882419.43046: variable 'controller_profile' from source: play vars 8454 1726882419.43135: variable 'controller_profile' from source: play vars 8454 1726882419.43157: variable 'port2_profile' from source: play vars 8454 1726882419.43245: variable 'port2_profile' from source: play vars 8454 1726882419.43263: variable 'dhcp_interface2' from source: play vars 8454 1726882419.43354: variable 'dhcp_interface2' from source: play vars 8454 1726882419.43371: variable 'controller_profile' from source: play vars 8454 1726882419.43460: variable 'controller_profile' from source: play vars 8454 1726882419.43524: variable '__network_packages_default_wireless' from source: role '' defaults 8454 1726882419.43628: variable '__network_wireless_connections_defined' from source: role '' defaults 8454 1726882419.44016: variable 'network_connections' from source: task vars 8454 1726882419.44028: variable 'controller_profile' from source: play vars 8454 1726882419.44117: variable 'controller_profile' from source: play vars 8454 1726882419.44240: variable 'controller_device' from source: play vars 8454 1726882419.44244: variable 'controller_device' from source: play vars 8454 1726882419.44246: variable 'port1_profile' from source: play vars 8454 1726882419.44321: variable 'port1_profile' from source: play vars 8454 1726882419.44338: variable 'dhcp_interface1' from source: play vars 8454 1726882419.44423: variable 'dhcp_interface1' from source: play vars 8454 1726882419.44440: variable 'controller_profile' from source: play vars 8454 1726882419.44526: variable 'controller_profile' from source: play vars 8454 
1726882419.44543: variable 'port2_profile' from source: play vars 8454 1726882419.44627: variable 'port2_profile' from source: play vars 8454 1726882419.44645: variable 'dhcp_interface2' from source: play vars 8454 1726882419.44731: variable 'dhcp_interface2' from source: play vars 8454 1726882419.44750: variable 'controller_profile' from source: play vars 8454 1726882419.44838: variable 'controller_profile' from source: play vars 8454 1726882419.44876: variable '__network_packages_default_team' from source: role '' defaults 8454 1726882419.44979: variable '__network_team_connections_defined' from source: role '' defaults 8454 1726882419.45568: variable 'network_connections' from source: task vars 8454 1726882419.45586: variable 'controller_profile' from source: play vars 8454 1726882419.45658: variable 'controller_profile' from source: play vars 8454 1726882419.45674: variable 'controller_device' from source: play vars 8454 1726882419.45731: variable 'controller_device' from source: play vars 8454 1726882419.45742: variable 'port1_profile' from source: play vars 8454 1726882419.45804: variable 'port1_profile' from source: play vars 8454 1726882419.45811: variable 'dhcp_interface1' from source: play vars 8454 1726882419.45875: variable 'dhcp_interface1' from source: play vars 8454 1726882419.45884: variable 'controller_profile' from source: play vars 8454 1726882419.45944: variable 'controller_profile' from source: play vars 8454 1726882419.45951: variable 'port2_profile' from source: play vars 8454 1726882419.46015: variable 'port2_profile' from source: play vars 8454 1726882419.46018: variable 'dhcp_interface2' from source: play vars 8454 1726882419.46077: variable 'dhcp_interface2' from source: play vars 8454 1726882419.46091: variable 'controller_profile' from source: play vars 8454 1726882419.46147: variable 'controller_profile' from source: play vars 8454 1726882419.46206: variable '__network_service_name_default_initscripts' from source: role '' defaults 
8454 1726882419.46259: variable '__network_service_name_default_initscripts' from source: role '' defaults 8454 1726882419.46266: variable '__network_packages_default_initscripts' from source: role '' defaults 8454 1726882419.46319: variable '__network_packages_default_initscripts' from source: role '' defaults 8454 1726882419.46507: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 8454 1726882419.46913: variable 'network_connections' from source: task vars 8454 1726882419.46918: variable 'controller_profile' from source: play vars 8454 1726882419.46972: variable 'controller_profile' from source: play vars 8454 1726882419.46979: variable 'controller_device' from source: play vars 8454 1726882419.47031: variable 'controller_device' from source: play vars 8454 1726882419.47042: variable 'port1_profile' from source: play vars 8454 1726882419.47096: variable 'port1_profile' from source: play vars 8454 1726882419.47103: variable 'dhcp_interface1' from source: play vars 8454 1726882419.47153: variable 'dhcp_interface1' from source: play vars 8454 1726882419.47159: variable 'controller_profile' from source: play vars 8454 1726882419.47212: variable 'controller_profile' from source: play vars 8454 1726882419.47220: variable 'port2_profile' from source: play vars 8454 1726882419.47271: variable 'port2_profile' from source: play vars 8454 1726882419.47278: variable 'dhcp_interface2' from source: play vars 8454 1726882419.47332: variable 'dhcp_interface2' from source: play vars 8454 1726882419.47341: variable 'controller_profile' from source: play vars 8454 1726882419.47393: variable 'controller_profile' from source: play vars 8454 1726882419.47403: variable 'ansible_distribution' from source: facts 8454 1726882419.47406: variable '__network_rh_distros' from source: role '' defaults 8454 1726882419.47411: variable 'ansible_distribution_major_version' from source: facts 8454 1726882419.47450: variable 
'__network_packages_default_initscripts_network_scripts' from source: role '' defaults 8454 1726882419.47671: variable 'ansible_distribution' from source: facts 8454 1726882419.47675: variable '__network_rh_distros' from source: role '' defaults 8454 1726882419.47677: variable 'ansible_distribution_major_version' from source: facts 8454 1726882419.47680: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 8454 1726882419.48056: variable 'ansible_distribution' from source: facts 8454 1726882419.48060: variable '__network_rh_distros' from source: role '' defaults 8454 1726882419.48063: variable 'ansible_distribution_major_version' from source: facts 8454 1726882419.48065: variable 'network_provider' from source: set_fact 8454 1726882419.48067: variable 'omit' from source: magic vars 8454 1726882419.48070: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 8454 1726882419.48093: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 8454 1726882419.48115: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 8454 1726882419.48138: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8454 1726882419.48160: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8454 1726882419.48207: variable 'inventory_hostname' from source: host vars for 'managed_node3' 8454 1726882419.48211: variable 'ansible_host' from source: host vars for 'managed_node3' 8454 1726882419.48216: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8454 1726882419.48353: Set connection var ansible_connection to ssh 8454 1726882419.48365: Set connection var ansible_shell_executable to /bin/sh 8454 1726882419.48372: Set connection var ansible_timeout 
to 10 8454 1726882419.48375: Set connection var ansible_shell_type to sh 8454 1726882419.48396: Set connection var ansible_pipelining to False 8454 1726882419.48408: Set connection var ansible_module_compression to ZIP_DEFLATED 8454 1726882419.48440: variable 'ansible_shell_executable' from source: unknown 8454 1726882419.48445: variable 'ansible_connection' from source: unknown 8454 1726882419.48448: variable 'ansible_module_compression' from source: unknown 8454 1726882419.48480: variable 'ansible_shell_type' from source: unknown 8454 1726882419.48484: variable 'ansible_shell_executable' from source: unknown 8454 1726882419.48486: variable 'ansible_host' from source: host vars for 'managed_node3' 8454 1726882419.48489: variable 'ansible_pipelining' from source: unknown 8454 1726882419.48492: variable 'ansible_timeout' from source: unknown 8454 1726882419.48494: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8454 1726882419.48636: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 8454 1726882419.48647: variable 'omit' from source: magic vars 8454 1726882419.48715: starting attempt loop 8454 1726882419.48718: running the handler 8454 1726882419.48762: variable 'ansible_facts' from source: unknown 8454 1726882419.49535: _low_level_execute_command(): starting 8454 1726882419.49543: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 8454 1726882419.50039: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 8454 1726882419.50073: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: 
checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8454 1726882419.50076: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 8454 1726882419.50082: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8454 1726882419.50133: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 8454 1726882419.50138: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 8454 1726882419.50272: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8454 1726882419.52168: stdout chunk (state=3): >>>/root <<< 8454 1726882419.52273: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8454 1726882419.52327: stderr chunk (state=3): >>><<< 8454 1726882419.52331: stdout chunk (state=3): >>><<< 8454 1726882419.52353: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 
is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 8454 1726882419.52366: _low_level_execute_command(): starting 8454 1726882419.52373: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882419.5235426-9106-237199641157293 `" && echo ansible-tmp-1726882419.5235426-9106-237199641157293="` echo /root/.ansible/tmp/ansible-tmp-1726882419.5235426-9106-237199641157293 `" ) && sleep 0' 8454 1726882419.52818: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 8454 1726882419.52840: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 8454 1726882419.52844: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8454 1726882419.52847: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address <<< 8454 1726882419.52849: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 
8454 1726882419.52874: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8454 1726882419.52926: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 8454 1726882419.52930: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 8454 1726882419.53052: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8454 1726882419.55146: stdout chunk (state=3): >>>ansible-tmp-1726882419.5235426-9106-237199641157293=/root/.ansible/tmp/ansible-tmp-1726882419.5235426-9106-237199641157293 <<< 8454 1726882419.55355: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8454 1726882419.55461: stderr chunk (state=3): >>><<< 8454 1726882419.55465: stdout chunk (state=3): >>><<< 8454 1726882419.55524: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882419.5235426-9106-237199641157293=/root/.ansible/tmp/ansible-tmp-1726882419.5235426-9106-237199641157293 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 
'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 8454 1726882419.55532: variable 'ansible_module_compression' from source: unknown 8454 1726882419.55601: ANSIBALLZ: Using generic lock for ansible.legacy.systemd 8454 1726882419.55605: ANSIBALLZ: Acquiring lock 8454 1726882419.55608: ANSIBALLZ: Lock acquired: 140055527345136 8454 1726882419.55610: ANSIBALLZ: Creating module 8454 1726882419.79802: ANSIBALLZ: Writing module into payload 8454 1726882419.79943: ANSIBALLZ: Writing module 8454 1726882419.79969: ANSIBALLZ: Renaming module 8454 1726882419.79974: ANSIBALLZ: Done creating module 8454 1726882419.80006: variable 'ansible_facts' from source: unknown 8454 1726882419.80149: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882419.5235426-9106-237199641157293/AnsiballZ_systemd.py 8454 1726882419.80275: Sending initial data 8454 1726882419.80279: Sent initial data (154 bytes) 8454 1726882419.80769: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 8454 1726882419.80775: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8454 1726882419.80780: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 
8454 1726882419.80782: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found <<< 8454 1726882419.80785: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8454 1726882419.80840: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 8454 1726882419.80844: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 8454 1726882419.80848: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8454 1726882419.80966: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8454 1726882419.82770: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 <<< 8454 1726882419.82775: stderr chunk (state=3): >>>debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 8454 1726882419.82892: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 8454 1726882419.83016: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-8454ynh6eu7y/tmpievr6ks8 /root/.ansible/tmp/ansible-tmp-1726882419.5235426-9106-237199641157293/AnsiballZ_systemd.py <<< 8454 1726882419.83019: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882419.5235426-9106-237199641157293/AnsiballZ_systemd.py" <<< 8454 1726882419.83158: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-8454ynh6eu7y/tmpievr6ks8" to remote "/root/.ansible/tmp/ansible-tmp-1726882419.5235426-9106-237199641157293/AnsiballZ_systemd.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882419.5235426-9106-237199641157293/AnsiballZ_systemd.py" <<< 8454 1726882419.85826: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8454 1726882419.85942: stderr chunk (state=3): >>><<< 8454 1726882419.85946: stdout chunk (state=3): >>><<< 8454 1726882419.85949: done transferring module to remote 8454 1726882419.85951: _low_level_execute_command(): starting 8454 1726882419.85954: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882419.5235426-9106-237199641157293/ /root/.ansible/tmp/ansible-tmp-1726882419.5235426-9106-237199641157293/AnsiballZ_systemd.py && sleep 0' 8454 1726882419.86435: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 8454 1726882419.86440: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8454 1726882419.86442: stderr chunk (state=3): >>>debug1: configuration requests 
final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 8454 1726882419.86445: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8454 1726882419.86497: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 8454 1726882419.86504: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 8454 1726882419.86620: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8454 1726882419.88661: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8454 1726882419.88664: stdout chunk (state=3): >>><<< 8454 1726882419.88666: stderr chunk (state=3): >>><<< 8454 1726882419.88782: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found 
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 8454 1726882419.88785: _low_level_execute_command(): starting 8454 1726882419.88788: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882419.5235426-9106-237199641157293/AnsiballZ_systemd.py && sleep 0' 8454 1726882419.89397: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8454 1726882419.89494: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 8454 1726882419.89515: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 8454 1726882419.89549: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8454 1726882419.89883: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8454 1726882420.22672: stdout chunk (state=3): >>> {"name": 
"NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "45s", "TimeoutAbortUSec": "45s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "abort", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "653", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 21:24:33 EDT", "ExecMainStartTimestampMonotonic": "18094121", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "653", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager 
org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "3421", "MemoryCurrent": "11665408", "MemoryAvailable": "infinity", "CPUUsageNSec": "653238000", "TasksCurrent": "4", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "18446744073709551615", "MemoryMax": "infinity", "StartupMemoryMax": "18446744073709551615", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "18446744073709551615", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "18446744073709551615", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "4425", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": 
"infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "inf<<< 8454 1726882420.22720: stdout chunk (state=3): >>>inity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "14752", "LimitNPROCSoft": "14752", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "14752", "LimitSIGPENDINGSoft": "14752", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": 
"init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "dbus.socket system.slice sysinit.target", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "NetworkManager-wait-online.service network.service shutdown.target multi-user.target network.target cloud-init.service", "After": "systemd-journald.socket dbus-broker.service 
dbus.socket basic.target network-pre.target system.slice sysinit.target cloud-init-local.service", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "DropInPaths": "/usr/lib/systemd/system/service.d/10-timeout-abort.conf", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 21:32:41 EDT", "StateChangeTimestampMonotonic": "505811565", "InactiveExitTimestamp": "Fri 2024-09-20 21:24:33 EDT", "InactiveExitTimestampMonotonic": "18094364", "ActiveEnterTimestamp": "Fri 2024-09-20 21:24:34 EDT", "ActiveEnterTimestampMonotonic": "18531095", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 21:24:33 EDT", "ConditionTimestampMonotonic": "18086405", "AssertTimestamp": "Fri 2024-09-20 21:24:33 EDT", "AssertTimestampMonotonic": "18086408", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "1c8adba7025b47b4adeb74e368331c9f", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": 
false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} <<< 8454 1726882420.24670: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.41.238 closed. <<< 8454 1726882420.24837: stderr chunk (state=3): >>><<< 8454 1726882420.24841: stdout chunk (state=3): >>><<< 8454 1726882420.24846: _low_level_execute_command() done: rc=0, stdout= {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "45s", "TimeoutAbortUSec": "45s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "abort", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "653", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 21:24:33 EDT", "ExecMainStartTimestampMonotonic": "18094121", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "653", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; 
status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "3421", "MemoryCurrent": "11665408", "MemoryAvailable": "infinity", "CPUUsageNSec": "653238000", "TasksCurrent": "4", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "18446744073709551615", "MemoryMax": "infinity", "StartupMemoryMax": "18446744073709551615", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "18446744073709551615", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "18446744073709551615", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", 
"TasksMax": "4425", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "14752", "LimitNPROCSoft": "14752", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "14752", "LimitSIGPENDINGSoft": "14752", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "RemoveIPC": "no", 
"PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "dbus.socket system.slice sysinit.target", 
"Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "NetworkManager-wait-online.service network.service shutdown.target multi-user.target network.target cloud-init.service", "After": "systemd-journald.socket dbus-broker.service dbus.socket basic.target network-pre.target system.slice sysinit.target cloud-init-local.service", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "DropInPaths": "/usr/lib/systemd/system/service.d/10-timeout-abort.conf", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 21:32:41 EDT", "StateChangeTimestampMonotonic": "505811565", "InactiveExitTimestamp": "Fri 2024-09-20 21:24:33 EDT", "InactiveExitTimestampMonotonic": "18094364", "ActiveEnterTimestamp": "Fri 2024-09-20 21:24:34 EDT", "ActiveEnterTimestampMonotonic": "18531095", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 21:24:33 EDT", "ConditionTimestampMonotonic": "18086405", "AssertTimestamp": "Fri 2024-09-20 21:24:33 EDT", "AssertTimestampMonotonic": "18086408", "Transient": "no", "Perpetual": "no", 
"StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "1c8adba7025b47b4adeb74e368331c9f", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.41.238 closed. 
8454 1726882420.24947: done with _execute_module (ansible.legacy.systemd, {'name': 'NetworkManager', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882419.5235426-9106-237199641157293/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 8454 1726882420.24965: _low_level_execute_command(): starting 8454 1726882420.24969: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882419.5235426-9106-237199641157293/ > /dev/null 2>&1 && sleep 0' 8454 1726882420.25429: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 8454 1726882420.25432: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found <<< 8454 1726882420.25437: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration <<< 8454 1726882420.25440: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 <<< 8454 1726882420.25442: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8454 1726882420.25486: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 8454 1726882420.25505: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8454 1726882420.25617: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8454 1726882420.27737: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8454 1726882420.27740: stdout chunk (state=3): >>><<< 8454 1726882420.27743: stderr chunk (state=3): >>><<< 8454 1726882420.27763: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 
2 debug2: Received exit status from master 0 8454 1726882420.27824: handler run complete 8454 1726882420.27855: attempt loop complete, returning result 8454 1726882420.27864: _execute() done 8454 1726882420.27867: dumping result to json 8454 1726882420.27883: done dumping result, returning 8454 1726882420.27892: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [0affe814-3a2d-f59f-16b9-000000000032] 8454 1726882420.27898: sending task result for task 0affe814-3a2d-f59f-16b9-000000000032 8454 1726882420.28251: done sending task result for task 0affe814-3a2d-f59f-16b9-000000000032 8454 1726882420.28254: WORKER PROCESS EXITING ok: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 8454 1726882420.28310: no more pending results, returning what we have 8454 1726882420.28314: results queue empty 8454 1726882420.28315: checking for any_errors_fatal 8454 1726882420.28320: done checking for any_errors_fatal 8454 1726882420.28321: checking for max_fail_percentage 8454 1726882420.28323: done checking for max_fail_percentage 8454 1726882420.28324: checking to see if all hosts have failed and the running result is not ok 8454 1726882420.28325: done checking to see if all hosts have failed 8454 1726882420.28326: getting the remaining hosts for this loop 8454 1726882420.28328: done getting the remaining hosts for this loop 8454 1726882420.28332: getting the next task for host managed_node3 8454 1726882420.28340: done getting next task for host managed_node3 8454 1726882420.28344: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 8454 1726882420.28347: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8454 1726882420.28359: getting variables 8454 1726882420.28361: in VariableManager get_vars() 8454 1726882420.28403: Calling all_inventory to load vars for managed_node3 8454 1726882420.28406: Calling groups_inventory to load vars for managed_node3 8454 1726882420.28409: Calling all_plugins_inventory to load vars for managed_node3 8454 1726882420.28420: Calling all_plugins_play to load vars for managed_node3 8454 1726882420.28423: Calling groups_plugins_inventory to load vars for managed_node3 8454 1726882420.28427: Calling groups_plugins_play to load vars for managed_node3 8454 1726882420.30317: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8454 1726882420.32206: done with get_vars() 8454 1726882420.32241: done getting variables 8454 1726882420.32313: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Friday 20 September 2024 21:33:40 -0400 (0:00:00.993) 0:00:18.341 ****** 8454 1726882420.32354: entering _queue_task() for managed_node3/service 8454 1726882420.32650: worker is 1 (out of 1 available) 8454 1726882420.32664: exiting _queue_task() for managed_node3/service 8454 
1726882420.32679: done queuing things up, now waiting for results queue to drain 8454 1726882420.32680: waiting for pending results... 8454 1726882420.32881: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 8454 1726882420.32992: in run() - task 0affe814-3a2d-f59f-16b9-000000000033 8454 1726882420.33005: variable 'ansible_search_path' from source: unknown 8454 1726882420.33011: variable 'ansible_search_path' from source: unknown 8454 1726882420.33047: calling self._execute() 8454 1726882420.33121: variable 'ansible_host' from source: host vars for 'managed_node3' 8454 1726882420.33125: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8454 1726882420.33137: variable 'omit' from source: magic vars 8454 1726882420.33468: variable 'ansible_distribution_major_version' from source: facts 8454 1726882420.33480: Evaluated conditional (ansible_distribution_major_version != '6'): True 8454 1726882420.33583: variable 'network_provider' from source: set_fact 8454 1726882420.33591: Evaluated conditional (network_provider == "nm"): True 8454 1726882420.33673: variable '__network_wpa_supplicant_required' from source: role '' defaults 8454 1726882420.33749: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 8454 1726882420.33898: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 8454 1726882420.35564: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 8454 1726882420.35619: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 8454 1726882420.35653: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 8454 1726882420.35686: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 8454 
1726882420.35712: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 8454 1726882420.35791: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 8454 1726882420.35819: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 8454 1726882420.35841: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 8454 1726882420.35875: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 8454 1726882420.35891: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 8454 1726882420.35938: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 8454 1726882420.35958: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 8454 1726882420.35979: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 8454 1726882420.36012: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 8454 1726882420.36027: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 8454 1726882420.36064: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 8454 1726882420.36087: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 8454 1726882420.36107: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 8454 1726882420.36143: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 8454 1726882420.36156: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 8454 1726882420.36271: variable 'network_connections' from source: task vars 8454 1726882420.36284: variable 'controller_profile' from source: play vars 8454 1726882420.36337: variable 'controller_profile' from source: play vars 8454 1726882420.36348: variable 'controller_device' from source: play vars 8454 1726882420.36401: variable 'controller_device' from source: play vars 8454 1726882420.36411: variable 'port1_profile' from source: play vars 8454 1726882420.36467: 
variable 'port1_profile' from source: play vars 8454 1726882420.36470: variable 'dhcp_interface1' from source: play vars 8454 1726882420.36524: variable 'dhcp_interface1' from source: play vars 8454 1726882420.36531: variable 'controller_profile' from source: play vars 8454 1726882420.36585: variable 'controller_profile' from source: play vars 8454 1726882420.36592: variable 'port2_profile' from source: play vars 8454 1726882420.36643: variable 'port2_profile' from source: play vars 8454 1726882420.36650: variable 'dhcp_interface2' from source: play vars 8454 1726882420.36703: variable 'dhcp_interface2' from source: play vars 8454 1726882420.36710: variable 'controller_profile' from source: play vars 8454 1726882420.36767: variable 'controller_profile' from source: play vars 8454 1726882420.36831: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 8454 1726882420.36962: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 8454 1726882420.36996: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 8454 1726882420.37026: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 8454 1726882420.37054: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 8454 1726882420.37092: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 8454 1726882420.37110: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 8454 1726882420.37136: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 8454 1726882420.37162: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 8454 1726882420.37208: variable '__network_wireless_connections_defined' from source: role '' defaults 8454 1726882420.37423: variable 'network_connections' from source: task vars 8454 1726882420.37427: variable 'controller_profile' from source: play vars 8454 1726882420.37482: variable 'controller_profile' from source: play vars 8454 1726882420.37492: variable 'controller_device' from source: play vars 8454 1726882420.37543: variable 'controller_device' from source: play vars 8454 1726882420.37563: variable 'port1_profile' from source: play vars 8454 1726882420.37609: variable 'port1_profile' from source: play vars 8454 1726882420.37616: variable 'dhcp_interface1' from source: play vars 8454 1726882420.37670: variable 'dhcp_interface1' from source: play vars 8454 1726882420.37677: variable 'controller_profile' from source: play vars 8454 1726882420.37730: variable 'controller_profile' from source: play vars 8454 1726882420.37737: variable 'port2_profile' from source: play vars 8454 1726882420.37792: variable 'port2_profile' from source: play vars 8454 1726882420.37799: variable 'dhcp_interface2' from source: play vars 8454 1726882420.37852: variable 'dhcp_interface2' from source: play vars 8454 1726882420.37859: variable 'controller_profile' from source: play vars 8454 1726882420.37914: variable 'controller_profile' from source: play vars 8454 1726882420.37949: Evaluated conditional (__network_wpa_supplicant_required): False 8454 1726882420.37953: when evaluation is False, skipping this task 8454 1726882420.37956: _execute() done 8454 1726882420.37961: dumping result to json 8454 1726882420.37964: done dumping result, returning 8454 
1726882420.37972: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [0affe814-3a2d-f59f-16b9-000000000033] 8454 1726882420.37977: sending task result for task 0affe814-3a2d-f59f-16b9-000000000033 8454 1726882420.38076: done sending task result for task 0affe814-3a2d-f59f-16b9-000000000033 8454 1726882420.38079: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "__network_wpa_supplicant_required", "skip_reason": "Conditional result was False" } 8454 1726882420.38131: no more pending results, returning what we have 8454 1726882420.38137: results queue empty 8454 1726882420.38138: checking for any_errors_fatal 8454 1726882420.38164: done checking for any_errors_fatal 8454 1726882420.38165: checking for max_fail_percentage 8454 1726882420.38167: done checking for max_fail_percentage 8454 1726882420.38168: checking to see if all hosts have failed and the running result is not ok 8454 1726882420.38169: done checking to see if all hosts have failed 8454 1726882420.38170: getting the remaining hosts for this loop 8454 1726882420.38171: done getting the remaining hosts for this loop 8454 1726882420.38176: getting the next task for host managed_node3 8454 1726882420.38182: done getting next task for host managed_node3 8454 1726882420.38187: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service 8454 1726882420.38190: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False 8454 1726882420.38205: getting variables 8454 1726882420.38207: in VariableManager get_vars() 8454 1726882420.38253: Calling all_inventory to load vars for managed_node3 8454 1726882420.38256: Calling groups_inventory to load vars for managed_node3 8454 1726882420.38259: Calling all_plugins_inventory to load vars for managed_node3 8454 1726882420.38269: Calling all_plugins_play to load vars for managed_node3 8454 1726882420.38272: Calling groups_plugins_inventory to load vars for managed_node3 8454 1726882420.38275: Calling groups_plugins_play to load vars for managed_node3 8454 1726882420.39439: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8454 1726882420.41093: done with get_vars() 8454 1726882420.41114: done getting variables 8454 1726882420.41161: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable network service] ************** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 Friday 20 September 2024 21:33:40 -0400 (0:00:00.088) 0:00:18.429 ****** 8454 1726882420.41188: entering _queue_task() for managed_node3/service 8454 1726882420.41410: worker is 1 (out of 1 available) 8454 1726882420.41424: exiting _queue_task() for managed_node3/service 8454 1726882420.41439: done queuing things up, now waiting for results queue to drain 8454 1726882420.41441: waiting for pending results... 
8454 1726882420.41620: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable network service 8454 1726882420.41720: in run() - task 0affe814-3a2d-f59f-16b9-000000000034 8454 1726882420.41733: variable 'ansible_search_path' from source: unknown 8454 1726882420.41739: variable 'ansible_search_path' from source: unknown 8454 1726882420.41772: calling self._execute() 8454 1726882420.41846: variable 'ansible_host' from source: host vars for 'managed_node3' 8454 1726882420.41852: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8454 1726882420.41863: variable 'omit' from source: magic vars 8454 1726882420.42170: variable 'ansible_distribution_major_version' from source: facts 8454 1726882420.42180: Evaluated conditional (ansible_distribution_major_version != '6'): True 8454 1726882420.42281: variable 'network_provider' from source: set_fact 8454 1726882420.42289: Evaluated conditional (network_provider == "initscripts"): False 8454 1726882420.42292: when evaluation is False, skipping this task 8454 1726882420.42295: _execute() done 8454 1726882420.42300: dumping result to json 8454 1726882420.42305: done dumping result, returning 8454 1726882420.42312: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable network service [0affe814-3a2d-f59f-16b9-000000000034] 8454 1726882420.42318: sending task result for task 0affe814-3a2d-f59f-16b9-000000000034 8454 1726882420.42410: done sending task result for task 0affe814-3a2d-f59f-16b9-000000000034 8454 1726882420.42413: WORKER PROCESS EXITING skipping: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 8454 1726882420.42475: no more pending results, returning what we have 8454 1726882420.42479: results queue empty 8454 1726882420.42480: checking for any_errors_fatal 8454 1726882420.42485: done checking for any_errors_fatal 8454 
1726882420.42486: checking for max_fail_percentage 8454 1726882420.42489: done checking for max_fail_percentage 8454 1726882420.42490: checking to see if all hosts have failed and the running result is not ok 8454 1726882420.42491: done checking to see if all hosts have failed 8454 1726882420.42492: getting the remaining hosts for this loop 8454 1726882420.42493: done getting the remaining hosts for this loop 8454 1726882420.42497: getting the next task for host managed_node3 8454 1726882420.42503: done getting next task for host managed_node3 8454 1726882420.42507: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 8454 1726882420.42510: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8454 1726882420.42524: getting variables 8454 1726882420.42525: in VariableManager get_vars() 8454 1726882420.42565: Calling all_inventory to load vars for managed_node3 8454 1726882420.42569: Calling groups_inventory to load vars for managed_node3 8454 1726882420.42571: Calling all_plugins_inventory to load vars for managed_node3 8454 1726882420.42579: Calling all_plugins_play to load vars for managed_node3 8454 1726882420.42581: Calling groups_plugins_inventory to load vars for managed_node3 8454 1726882420.42584: Calling groups_plugins_play to load vars for managed_node3 8454 1726882420.43712: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8454 1726882420.45244: done with get_vars() 8454 1726882420.45264: done getting variables 8454 1726882420.45313: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] *** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150 Friday 20 September 2024 21:33:40 -0400 (0:00:00.041) 0:00:18.471 ****** 8454 1726882420.45340: entering _queue_task() for managed_node3/copy 8454 1726882420.45538: worker is 1 (out of 1 available) 8454 1726882420.45551: exiting _queue_task() for managed_node3/copy 8454 1726882420.45565: done queuing things up, now waiting for results queue to drain 8454 1726882420.45566: waiting for pending results... 
8454 1726882420.45737: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 8454 1726882420.45835: in run() - task 0affe814-3a2d-f59f-16b9-000000000035 8454 1726882420.45848: variable 'ansible_search_path' from source: unknown 8454 1726882420.45852: variable 'ansible_search_path' from source: unknown 8454 1726882420.45881: calling self._execute() 8454 1726882420.45958: variable 'ansible_host' from source: host vars for 'managed_node3' 8454 1726882420.45965: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8454 1726882420.45975: variable 'omit' from source: magic vars 8454 1726882420.46276: variable 'ansible_distribution_major_version' from source: facts 8454 1726882420.46290: Evaluated conditional (ansible_distribution_major_version != '6'): True 8454 1726882420.46389: variable 'network_provider' from source: set_fact 8454 1726882420.46393: Evaluated conditional (network_provider == "initscripts"): False 8454 1726882420.46397: when evaluation is False, skipping this task 8454 1726882420.46402: _execute() done 8454 1726882420.46405: dumping result to json 8454 1726882420.46410: done dumping result, returning 8454 1726882420.46419: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [0affe814-3a2d-f59f-16b9-000000000035] 8454 1726882420.46424: sending task result for task 0affe814-3a2d-f59f-16b9-000000000035 8454 1726882420.46525: done sending task result for task 0affe814-3a2d-f59f-16b9-000000000035 8454 1726882420.46527: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 8454 1726882420.46593: no more pending results, returning what we have 8454 1726882420.46596: results queue empty 8454 1726882420.46597: checking for any_errors_fatal 8454 
1726882420.46601: done checking for any_errors_fatal 8454 1726882420.46602: checking for max_fail_percentage 8454 1726882420.46604: done checking for max_fail_percentage 8454 1726882420.46605: checking to see if all hosts have failed and the running result is not ok 8454 1726882420.46605: done checking to see if all hosts have failed 8454 1726882420.46606: getting the remaining hosts for this loop 8454 1726882420.46608: done getting the remaining hosts for this loop 8454 1726882420.46611: getting the next task for host managed_node3 8454 1726882420.46617: done getting next task for host managed_node3 8454 1726882420.46621: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles 8454 1726882420.46624: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8454 1726882420.46640: getting variables 8454 1726882420.46642: in VariableManager get_vars() 8454 1726882420.46670: Calling all_inventory to load vars for managed_node3 8454 1726882420.46672: Calling groups_inventory to load vars for managed_node3 8454 1726882420.46674: Calling all_plugins_inventory to load vars for managed_node3 8454 1726882420.46681: Calling all_plugins_play to load vars for managed_node3 8454 1726882420.46684: Calling groups_plugins_inventory to load vars for managed_node3 8454 1726882420.46686: Calling groups_plugins_play to load vars for managed_node3 8454 1726882420.47923: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8454 1726882420.49440: done with get_vars() 8454 1726882420.49459: done getting variables TASK [fedora.linux_system_roles.network : Configure networking connection profiles] *** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Friday 20 September 2024 21:33:40 -0400 (0:00:00.041) 0:00:18.512 ****** 8454 1726882420.49524: entering _queue_task() for managed_node3/fedora.linux_system_roles.network_connections 8454 1726882420.49526: Creating lock for fedora.linux_system_roles.network_connections 8454 1726882420.49727: worker is 1 (out of 1 available) 8454 1726882420.49743: exiting _queue_task() for managed_node3/fedora.linux_system_roles.network_connections 8454 1726882420.49756: done queuing things up, now waiting for results queue to drain 8454 1726882420.49758: waiting for pending results... 
8454 1726882420.49922: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 8454 1726882420.50014: in run() - task 0affe814-3a2d-f59f-16b9-000000000036 8454 1726882420.50026: variable 'ansible_search_path' from source: unknown 8454 1726882420.50029: variable 'ansible_search_path' from source: unknown 8454 1726882420.50061: calling self._execute() 8454 1726882420.50139: variable 'ansible_host' from source: host vars for 'managed_node3' 8454 1726882420.50144: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8454 1726882420.50158: variable 'omit' from source: magic vars 8454 1726882420.50465: variable 'ansible_distribution_major_version' from source: facts 8454 1726882420.50477: Evaluated conditional (ansible_distribution_major_version != '6'): True 8454 1726882420.50482: variable 'omit' from source: magic vars 8454 1726882420.50534: variable 'omit' from source: magic vars 8454 1726882420.50666: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 8454 1726882420.52308: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 8454 1726882420.52360: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 8454 1726882420.52395: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 8454 1726882420.52427: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 8454 1726882420.52452: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 8454 1726882420.52516: variable 'network_provider' from source: set_fact 8454 1726882420.52623: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 8454 1726882420.52659: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 8454 1726882420.52683: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 8454 1726882420.52714: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 8454 1726882420.52736: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 8454 1726882420.52798: variable 'omit' from source: magic vars 8454 1726882420.52898: variable 'omit' from source: magic vars 8454 1726882420.52989: variable 'network_connections' from source: task vars 8454 1726882420.52999: variable 'controller_profile' from source: play vars 8454 1726882420.53056: variable 'controller_profile' from source: play vars 8454 1726882420.53065: variable 'controller_device' from source: play vars 8454 1726882420.53118: variable 'controller_device' from source: play vars 8454 1726882420.53128: variable 'port1_profile' from source: play vars 8454 1726882420.53185: variable 'port1_profile' from source: play vars 8454 1726882420.53192: variable 'dhcp_interface1' from source: play vars 8454 1726882420.53243: variable 'dhcp_interface1' from source: play vars 8454 1726882420.53250: variable 'controller_profile' from source: play vars 8454 1726882420.53307: variable 'controller_profile' from source: play vars 8454 1726882420.53314: variable 
'port2_profile' from source: play vars 8454 1726882420.53366: variable 'port2_profile' from source: play vars 8454 1726882420.53377: variable 'dhcp_interface2' from source: play vars 8454 1726882420.53426: variable 'dhcp_interface2' from source: play vars 8454 1726882420.53433: variable 'controller_profile' from source: play vars 8454 1726882420.53495: variable 'controller_profile' from source: play vars 8454 1726882420.53642: variable 'omit' from source: magic vars 8454 1726882420.53651: variable '__lsr_ansible_managed' from source: task vars 8454 1726882420.53704: variable '__lsr_ansible_managed' from source: task vars 8454 1726882420.53856: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup 8454 1726882420.54033: Loaded config def from plugin (lookup/template) 8454 1726882420.54038: Loading LookupModule 'template' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/template.py 8454 1726882420.54064: File lookup term: get_ansible_managed.j2 8454 1726882420.54068: variable 'ansible_search_path' from source: unknown 8454 1726882420.54072: evaluation_path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks 8454 1726882420.54085: search_path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/templates/get_ansible_managed.j2 /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/get_ansible_managed.j2 /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/templates/get_ansible_managed.j2 /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/get_ansible_managed.j2 /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 
/tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 8454 1726882420.54101: variable 'ansible_search_path' from source: unknown 8454 1726882420.59343: variable 'ansible_managed' from source: unknown 8454 1726882420.59473: variable 'omit' from source: magic vars 8454 1726882420.59498: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 8454 1726882420.59520: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 8454 1726882420.59537: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 8454 1726882420.59553: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8454 1726882420.59563: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8454 1726882420.59590: variable 'inventory_hostname' from source: host vars for 'managed_node3' 8454 1726882420.59593: variable 'ansible_host' from source: host vars for 'managed_node3' 8454 1726882420.59597: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8454 1726882420.59672: Set connection var ansible_connection to ssh 8454 1726882420.59685: Set connection var ansible_shell_executable to /bin/sh 8454 1726882420.59690: Set connection var ansible_timeout to 10 8454 1726882420.59695: Set connection var ansible_shell_type to sh 8454 1726882420.59703: Set connection var ansible_pipelining to False 8454 1726882420.59712: Set connection var ansible_module_compression to ZIP_DEFLATED 8454 1726882420.59730: variable 
'ansible_shell_executable' from source: unknown 8454 1726882420.59733: variable 'ansible_connection' from source: unknown 8454 1726882420.59737: variable 'ansible_module_compression' from source: unknown 8454 1726882420.59742: variable 'ansible_shell_type' from source: unknown 8454 1726882420.59745: variable 'ansible_shell_executable' from source: unknown 8454 1726882420.59749: variable 'ansible_host' from source: host vars for 'managed_node3' 8454 1726882420.59754: variable 'ansible_pipelining' from source: unknown 8454 1726882420.59757: variable 'ansible_timeout' from source: unknown 8454 1726882420.59763: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8454 1726882420.59866: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 8454 1726882420.59875: variable 'omit' from source: magic vars 8454 1726882420.59887: starting attempt loop 8454 1726882420.59890: running the handler 8454 1726882420.59901: _low_level_execute_command(): starting 8454 1726882420.59908: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 8454 1726882420.60442: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 8454 1726882420.60446: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8454 1726882420.60448: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration 
debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 8454 1726882420.60451: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8454 1726882420.60512: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 8454 1726882420.60515: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 8454 1726882420.60520: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8454 1726882420.60640: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8454 1726882420.62489: stdout chunk (state=3): >>>/root <<< 8454 1726882420.62591: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8454 1726882420.62644: stderr chunk (state=3): >>><<< 8454 1726882420.62647: stdout chunk (state=3): >>><<< 8454 1726882420.62666: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match 
found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 8454 1726882420.62677: _low_level_execute_command(): starting 8454 1726882420.62685: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882420.6266632-9139-74794769558370 `" && echo ansible-tmp-1726882420.6266632-9139-74794769558370="` echo /root/.ansible/tmp/ansible-tmp-1726882420.6266632-9139-74794769558370 `" ) && sleep 0' 8454 1726882420.63196: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 8454 1726882420.63205: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8454 1726882420.63208: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 8454 1726882420.63211: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8454 1726882420.63267: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK 
<<< 8454 1726882420.63270: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8454 1726882420.63393: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8454 1726882420.65465: stdout chunk (state=3): >>>ansible-tmp-1726882420.6266632-9139-74794769558370=/root/.ansible/tmp/ansible-tmp-1726882420.6266632-9139-74794769558370 <<< 8454 1726882420.65594: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8454 1726882420.65639: stderr chunk (state=3): >>><<< 8454 1726882420.65643: stdout chunk (state=3): >>><<< 8454 1726882420.65659: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882420.6266632-9139-74794769558370=/root/.ansible/tmp/ansible-tmp-1726882420.6266632-9139-74794769558370 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 8454 1726882420.65705: variable 
'ansible_module_compression' from source: unknown 8454 1726882420.65749: ANSIBALLZ: Using lock for fedora.linux_system_roles.network_connections 8454 1726882420.65753: ANSIBALLZ: Acquiring lock 8454 1726882420.65756: ANSIBALLZ: Lock acquired: 140055533210464 8454 1726882420.65759: ANSIBALLZ: Creating module 8454 1726882420.83395: ANSIBALLZ: Writing module into payload 8454 1726882420.83748: ANSIBALLZ: Writing module 8454 1726882420.83774: ANSIBALLZ: Renaming module 8454 1726882420.83780: ANSIBALLZ: Done creating module 8454 1726882420.83808: variable 'ansible_facts' from source: unknown 8454 1726882420.83874: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882420.6266632-9139-74794769558370/AnsiballZ_network_connections.py 8454 1726882420.83995: Sending initial data 8454 1726882420.83999: Sent initial data (165 bytes) 8454 1726882420.84492: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 8454 1726882420.84496: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8454 1726882420.84498: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 8454 1726882420.84501: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8454 1726882420.84563: 
stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 8454 1726882420.84566: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 8454 1726882420.84569: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8454 1726882420.84694: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8454 1726882420.86493: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" <<< 8454 1726882420.86499: stderr chunk (state=3): >>>debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 8454 1726882420.86602: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 8454 1726882420.86716: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-8454ynh6eu7y/tmpmszvvcbc /root/.ansible/tmp/ansible-tmp-1726882420.6266632-9139-74794769558370/AnsiballZ_network_connections.py <<< 8454 1726882420.86724: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882420.6266632-9139-74794769558370/AnsiballZ_network_connections.py" <<< 8454 1726882420.86829: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-8454ynh6eu7y/tmpmszvvcbc" to remote "/root/.ansible/tmp/ansible-tmp-1726882420.6266632-9139-74794769558370/AnsiballZ_network_connections.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882420.6266632-9139-74794769558370/AnsiballZ_network_connections.py" <<< 8454 1726882420.88238: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8454 1726882420.88309: stderr chunk (state=3): >>><<< 8454 1726882420.88312: stdout chunk (state=3): >>><<< 8454 1726882420.88331: done transferring module to remote 8454 1726882420.88344: _low_level_execute_command(): starting 8454 1726882420.88350: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882420.6266632-9139-74794769558370/ /root/.ansible/tmp/ansible-tmp-1726882420.6266632-9139-74794769558370/AnsiballZ_network_connections.py && sleep 0' 8454 1726882420.88818: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 8454 1726882420.88821: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 8454 1726882420.88825: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 8454 1726882420.88827: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 8454 1726882420.88829: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8454 1726882420.88882: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 8454 1726882420.88885: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 8454 1726882420.89005: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8454 1726882420.90977: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8454 1726882420.91027: stderr chunk (state=3): >>><<< 8454 1726882420.91031: stdout chunk (state=3): >>><<< 8454 1726882420.91046: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 8454 1726882420.91049: _low_level_execute_command(): starting 8454 1726882420.91055: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882420.6266632-9139-74794769558370/AnsiballZ_network_connections.py && sleep 0' 8454 1726882420.91497: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 8454 1726882420.91500: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8454 1726882420.91503: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 8454 1726882420.91505: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8454 1726882420.91566: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 8454 1726882420.91569: 
stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8454 1726882420.91688: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8454 1726882421.37508: stdout chunk (state=3): >>> {"changed": true, "warnings": [], "stderr": "[007] #0, state:up persistent_state:present, 'bond0': add connection bond0, c39326a6-2860-4305-9ba8-6fb920a3fcc0\n[008] #1, state:up persistent_state:present, 'bond0.0': add connection bond0.0, ca811f30-7831-43b3-b534-83e0530ad93d\n[009] #2, state:up persistent_state:present, 'bond0.1': add connection bond0.1, 70730f1a-05dc-466f-88aa-5eb27a8fb665\n[010] #0, state:up persistent_state:present, 'bond0': up connection bond0, c39326a6-2860-4305-9ba8-6fb920a3fcc0 (is-modified)\n[011] #1, state:up persistent_state:present, 'bond0.0': up connection bond0.0, ca811f30-7831-43b3-b534-83e0530ad93d (not-active)\n[012] #2, state:up persistent_state:present, 'bond0.1': up connection bond0.1, 70730f1a-05dc-466f-88aa-5eb27a8fb665 (not-active)\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "bond0", "state": "up", "type": "bond", "interface_name": "nm-bond", "bond": {"mode": "active-backup", "miimon": 110}, "ip": {"route_metric4": 65535}}, {"name": "bond0.0", "state": "up", "type": "ethernet", "interface_name": "test1", "controller": "bond0"}, {"name": "bond0.1", "state": "up", "type": "ethernet", "interface_name": "test2", "controller": "bond0"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "bond0", "state": "up", "type": "bond", "interface_name": "nm-bond", "bond": {"mode": "active-backup", "miimon": 110}, "ip": {"route_metric4": 65535}}, {"name": "bond0.0", "state": "up", "type": "ethernet", "interface_name": "test1", "controller": "bond0"}, {"name": "bond0.1", "state": "up", "type": "ethernet", 
"interface_name": "test2", "controller": "bond0"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} <<< 8454 1726882421.39941: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.41.238 closed. <<< 8454 1726882421.39946: stdout chunk (state=3): >>><<< 8454 1726882421.39948: stderr chunk (state=3): >>><<< 8454 1726882421.39951: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "warnings": [], "stderr": "[007] #0, state:up persistent_state:present, 'bond0': add connection bond0, c39326a6-2860-4305-9ba8-6fb920a3fcc0\n[008] #1, state:up persistent_state:present, 'bond0.0': add connection bond0.0, ca811f30-7831-43b3-b534-83e0530ad93d\n[009] #2, state:up persistent_state:present, 'bond0.1': add connection bond0.1, 70730f1a-05dc-466f-88aa-5eb27a8fb665\n[010] #0, state:up persistent_state:present, 'bond0': up connection bond0, c39326a6-2860-4305-9ba8-6fb920a3fcc0 (is-modified)\n[011] #1, state:up persistent_state:present, 'bond0.0': up connection bond0.0, ca811f30-7831-43b3-b534-83e0530ad93d (not-active)\n[012] #2, state:up persistent_state:present, 'bond0.1': up connection bond0.1, 70730f1a-05dc-466f-88aa-5eb27a8fb665 (not-active)\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "bond0", "state": "up", "type": "bond", "interface_name": "nm-bond", "bond": {"mode": "active-backup", "miimon": 110}, "ip": {"route_metric4": 65535}}, {"name": "bond0.0", "state": "up", "type": "ethernet", "interface_name": "test1", "controller": "bond0"}, {"name": "bond0.1", "state": "up", "type": "ethernet", "interface_name": "test2", "controller": "bond0"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "bond0", "state": "up", "type": "bond", 
"interface_name": "nm-bond", "bond": {"mode": "active-backup", "miimon": 110}, "ip": {"route_metric4": 65535}}, {"name": "bond0.0", "state": "up", "type": "ethernet", "interface_name": "test1", "controller": "bond0"}, {"name": "bond0.1", "state": "up", "type": "ethernet", "interface_name": "test2", "controller": "bond0"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.41.238 closed. 
8454 1726882421.39970: done with _execute_module (fedora.linux_system_roles.network_connections, {'provider': 'nm', 'connections': [{'name': 'bond0', 'state': 'up', 'type': 'bond', 'interface_name': 'nm-bond', 'bond': {'mode': 'active-backup', 'miimon': 110}, 'ip': {'route_metric4': 65535}}, {'name': 'bond0.0', 'state': 'up', 'type': 'ethernet', 'interface_name': 'test1', 'controller': 'bond0'}, {'name': 'bond0.1', 'state': 'up', 'type': 'ethernet', 'interface_name': 'test2', 'controller': 'bond0'}], '__header': '#\n# Ansible managed\n#\n# system_role:network\n', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'fedora.linux_system_roles.network_connections', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882420.6266632-9139-74794769558370/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 8454 1726882421.39983: _low_level_execute_command(): starting 8454 1726882421.39987: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882420.6266632-9139-74794769558370/ > /dev/null 2>&1 && sleep 0' 8454 1726882421.40676: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 8454 1726882421.40686: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 8454 1726882421.40708: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 8454 1726882421.40725: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 8454 1726882421.40818: stderr chunk 
(state=3): >>>debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 <<< 8454 1726882421.40826: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8454 1726882421.40860: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 8454 1726882421.40926: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 8454 1726882421.40930: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8454 1726882421.41048: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8454 1726882421.43276: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8454 1726882421.43279: stdout chunk (state=3): >>><<< 8454 1726882421.43282: stderr chunk (state=3): >>><<< 8454 1726882421.43440: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration 
debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 8454 1726882421.43444: handler run complete 8454 1726882421.43447: attempt loop complete, returning result 8454 1726882421.43449: _execute() done 8454 1726882421.43451: dumping result to json 8454 1726882421.43454: done dumping result, returning 8454 1726882421.43456: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [0affe814-3a2d-f59f-16b9-000000000036] 8454 1726882421.43458: sending task result for task 0affe814-3a2d-f59f-16b9-000000000036 changed: [managed_node3] => { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "bond": { "miimon": 110, "mode": "active-backup" }, "interface_name": "nm-bond", "ip": { "route_metric4": 65535 }, "name": "bond0", "state": "up", "type": "bond" }, { "controller": "bond0", "interface_name": "test1", "name": "bond0.0", "state": "up", "type": "ethernet" }, { "controller": "bond0", "interface_name": "test2", "name": "bond0.1", "state": "up", "type": "ethernet" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true } STDERR: [007] #0, state:up persistent_state:present, 'bond0': add connection bond0, c39326a6-2860-4305-9ba8-6fb920a3fcc0 [008] #1, state:up persistent_state:present, 'bond0.0': add connection bond0.0, ca811f30-7831-43b3-b534-83e0530ad93d 
[009] #2, state:up persistent_state:present, 'bond0.1': add connection bond0.1, 70730f1a-05dc-466f-88aa-5eb27a8fb665 [010] #0, state:up persistent_state:present, 'bond0': up connection bond0, c39326a6-2860-4305-9ba8-6fb920a3fcc0 (is-modified) [011] #1, state:up persistent_state:present, 'bond0.0': up connection bond0.0, ca811f30-7831-43b3-b534-83e0530ad93d (not-active) [012] #2, state:up persistent_state:present, 'bond0.1': up connection bond0.1, 70730f1a-05dc-466f-88aa-5eb27a8fb665 (not-active) 8454 1726882421.44084: no more pending results, returning what we have 8454 1726882421.44090: results queue empty 8454 1726882421.44091: checking for any_errors_fatal 8454 1726882421.44099: done checking for any_errors_fatal 8454 1726882421.44100: checking for max_fail_percentage 8454 1726882421.44103: done checking for max_fail_percentage 8454 1726882421.44104: checking to see if all hosts have failed and the running result is not ok 8454 1726882421.44105: done checking to see if all hosts have failed 8454 1726882421.44106: getting the remaining hosts for this loop 8454 1726882421.44108: done getting the remaining hosts for this loop 8454 1726882421.44112: getting the next task for host managed_node3 8454 1726882421.44119: done getting next task for host managed_node3 8454 1726882421.44123: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 8454 1726882421.44127: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8454 1726882421.44142: getting variables 8454 1726882421.44144: in VariableManager get_vars() 8454 1726882421.44284: Calling all_inventory to load vars for managed_node3 8454 1726882421.44288: Calling groups_inventory to load vars for managed_node3 8454 1726882421.44291: Calling all_plugins_inventory to load vars for managed_node3 8454 1726882421.44394: Calling all_plugins_play to load vars for managed_node3 8454 1726882421.44398: Calling groups_plugins_inventory to load vars for managed_node3 8454 1726882421.44403: Calling groups_plugins_play to load vars for managed_node3 8454 1726882421.45005: done sending task result for task 0affe814-3a2d-f59f-16b9-000000000036 8454 1726882421.45013: WORKER PROCESS EXITING 8454 1726882421.46980: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8454 1726882421.52033: done with get_vars() 8454 1726882421.52123: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Friday 20 September 2024 21:33:41 -0400 (0:00:01.027) 0:00:19.540 ****** 8454 1726882421.52328: entering _queue_task() for managed_node3/fedora.linux_system_roles.network_state 8454 1726882421.52331: Creating lock for fedora.linux_system_roles.network_state 8454 1726882421.53182: worker is 1 (out of 1 available) 8454 1726882421.53197: exiting _queue_task() for managed_node3/fedora.linux_system_roles.network_state 8454 1726882421.53210: done queuing things up, now waiting for results queue to drain 8454 1726882421.53211: waiting for pending results... 
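The STDERR trace emitted by the module follows a regular per-connection format, e.g. `[011] #1, state:up persistent_state:present, 'bond0.0': up connection bond0.0, ca811f30-... (not-active)`. A small parser can turn those lines into structured records; the pattern below is inferred from this log only and is not a stable interface of the role:

```python
import re

# Illustrative parser for the network_connections trace lines seen above.
# The line format is reverse-engineered from this log, not a documented API.
_TRACE = re.compile(
    r"\[(?P<seq>\d+)\] #(?P<index>\d+), "
    r"state:(?P<state>\S+) persistent_state:(?P<persistent_state>\S+), "
    r"'(?P<name>[^']+)': (?P<action>\w+) connection (?P<connection>\S+), "
    r"(?P<uuid>[0-9a-f-]+)(?: \((?P<note>[^)]+)\))?$"
)

def parse_trace(line: str):
    """Return a dict of fields for one trace line, or None if it doesn't match."""
    m = _TRACE.match(line.strip())
    return m.groupdict() if m else None
```

For the `(not-active)` / `(is-modified)` suffix the optional `note` group is `None` on plain `add connection` lines, which makes it easy to filter activations from profile creation when post-processing a run.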
8454 1726882421.53704: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Configure networking state 8454 1726882421.53851: in run() - task 0affe814-3a2d-f59f-16b9-000000000037 8454 1726882421.53868: variable 'ansible_search_path' from source: unknown 8454 1726882421.53873: variable 'ansible_search_path' from source: unknown 8454 1726882421.53924: calling self._execute() 8454 1726882421.54032: variable 'ansible_host' from source: host vars for 'managed_node3' 8454 1726882421.54041: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8454 1726882421.54053: variable 'omit' from source: magic vars 8454 1726882421.54515: variable 'ansible_distribution_major_version' from source: facts 8454 1726882421.54530: Evaluated conditional (ansible_distribution_major_version != '6'): True 8454 1726882421.54698: variable 'network_state' from source: role '' defaults 8454 1726882421.54707: Evaluated conditional (network_state != {}): False 8454 1726882421.54710: when evaluation is False, skipping this task 8454 1726882421.54714: _execute() done 8454 1726882421.54719: dumping result to json 8454 1726882421.54724: done dumping result, returning 8454 1726882421.54736: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Configure networking state [0affe814-3a2d-f59f-16b9-000000000037] 8454 1726882421.54742: sending task result for task 0affe814-3a2d-f59f-16b9-000000000037 8454 1726882421.54964: done sending task result for task 0affe814-3a2d-f59f-16b9-000000000037 8454 1726882421.54968: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 8454 1726882421.55014: no more pending results, returning what we have 8454 1726882421.55038: results queue empty 8454 1726882421.55039: checking for any_errors_fatal 8454 1726882421.55050: done checking for any_errors_fatal 8454 1726882421.55051: 
checking for max_fail_percentage 8454 1726882421.55052: done checking for max_fail_percentage 8454 1726882421.55053: checking to see if all hosts have failed and the running result is not ok 8454 1726882421.55054: done checking to see if all hosts have failed 8454 1726882421.55055: getting the remaining hosts for this loop 8454 1726882421.55057: done getting the remaining hosts for this loop 8454 1726882421.55061: getting the next task for host managed_node3 8454 1726882421.55067: done getting next task for host managed_node3 8454 1726882421.55071: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 8454 1726882421.55074: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8454 1726882421.55088: getting variables 8454 1726882421.55090: in VariableManager get_vars() 8454 1726882421.55141: Calling all_inventory to load vars for managed_node3 8454 1726882421.55145: Calling groups_inventory to load vars for managed_node3 8454 1726882421.55148: Calling all_plugins_inventory to load vars for managed_node3 8454 1726882421.55159: Calling all_plugins_play to load vars for managed_node3 8454 1726882421.55162: Calling groups_plugins_inventory to load vars for managed_node3 8454 1726882421.55166: Calling groups_plugins_play to load vars for managed_node3 8454 1726882421.59125: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8454 1726882421.65782: done with get_vars() 8454 1726882421.65825: done getting variables 8454 1726882421.66110: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Friday 20 September 2024 21:33:41 -0400 (0:00:00.138) 0:00:19.679 ****** 8454 1726882421.66157: entering _queue_task() for managed_node3/debug 8454 1726882421.66952: worker is 1 (out of 1 available) 8454 1726882421.66967: exiting _queue_task() for managed_node3/debug 8454 1726882421.66981: done queuing things up, now waiting for results queue to drain 8454 1726882421.66982: waiting for pending results... 
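The "Configure networking state" task above is skipped because its conditional evaluates to False. A minimal sketch of why, assuming the role's default value for `network_state` (the file path is hypothetical; the task path and conditional are taken from the log):

```yaml
# Role default (e.g. roles/network/defaults/main.yml):
network_state: {}

# The task at roles/network/tasks/main.yml:171 is gated on:
#   when: network_state != {}
# With the default empty dict this is False, producing the
# "Conditional result was False" skip recorded above.
```

Only plays that supply a non-empty `network_state` would reach the `fedora.linux_system_roles.network_state` module; this run configures everything through `network_connections` instead.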
8454 1726882421.67440: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 8454 1726882421.67886: in run() - task 0affe814-3a2d-f59f-16b9-000000000038 8454 1726882421.67891: variable 'ansible_search_path' from source: unknown 8454 1726882421.67893: variable 'ansible_search_path' from source: unknown 8454 1726882421.67897: calling self._execute() 8454 1726882421.68110: variable 'ansible_host' from source: host vars for 'managed_node3' 8454 1726882421.68441: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8454 1726882421.68446: variable 'omit' from source: magic vars 8454 1726882421.69216: variable 'ansible_distribution_major_version' from source: facts 8454 1726882421.69230: Evaluated conditional (ansible_distribution_major_version != '6'): True 8454 1726882421.69239: variable 'omit' from source: magic vars 8454 1726882421.69431: variable 'omit' from source: magic vars 8454 1726882421.69476: variable 'omit' from source: magic vars 8454 1726882421.69529: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 8454 1726882421.69690: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 8454 1726882421.69713: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 8454 1726882421.69736: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8454 1726882421.69875: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8454 1726882421.69969: variable 'inventory_hostname' from source: host vars for 'managed_node3' 8454 1726882421.69972: variable 'ansible_host' from source: host vars for 'managed_node3' 8454 1726882421.69975: variable 'ansible_ssh_extra_args' from source: host vars 
for 'managed_node3' 8454 1726882421.70239: Set connection var ansible_connection to ssh 8454 1726882421.70251: Set connection var ansible_shell_executable to /bin/sh 8454 1726882421.70259: Set connection var ansible_timeout to 10 8454 1726882421.70262: Set connection var ansible_shell_type to sh 8454 1726882421.70273: Set connection var ansible_pipelining to False 8454 1726882421.70282: Set connection var ansible_module_compression to ZIP_DEFLATED 8454 1726882421.70427: variable 'ansible_shell_executable' from source: unknown 8454 1726882421.70430: variable 'ansible_connection' from source: unknown 8454 1726882421.70435: variable 'ansible_module_compression' from source: unknown 8454 1726882421.70513: variable 'ansible_shell_type' from source: unknown 8454 1726882421.70516: variable 'ansible_shell_executable' from source: unknown 8454 1726882421.70518: variable 'ansible_host' from source: host vars for 'managed_node3' 8454 1726882421.70520: variable 'ansible_pipelining' from source: unknown 8454 1726882421.70522: variable 'ansible_timeout' from source: unknown 8454 1726882421.70524: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8454 1726882421.70732: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 8454 1726882421.70940: variable 'omit' from source: magic vars 8454 1726882421.70944: starting attempt loop 8454 1726882421.70947: running the handler 8454 1726882421.71168: variable '__network_connections_result' from source: set_fact 8454 1726882421.71328: handler run complete 8454 1726882421.71406: attempt loop complete, returning result 8454 1726882421.71410: _execute() done 8454 1726882421.71413: dumping result to json 8454 1726882421.71418: done dumping result, returning 8454 
1726882421.71431: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [0affe814-3a2d-f59f-16b9-000000000038] 8454 1726882421.71460: sending task result for task 0affe814-3a2d-f59f-16b9-000000000038 8454 1726882421.71792: done sending task result for task 0affe814-3a2d-f59f-16b9-000000000038 8454 1726882421.71796: WORKER PROCESS EXITING ok: [managed_node3] => { "__network_connections_result.stderr_lines": [ "[007] #0, state:up persistent_state:present, 'bond0': add connection bond0, c39326a6-2860-4305-9ba8-6fb920a3fcc0", "[008] #1, state:up persistent_state:present, 'bond0.0': add connection bond0.0, ca811f30-7831-43b3-b534-83e0530ad93d", "[009] #2, state:up persistent_state:present, 'bond0.1': add connection bond0.1, 70730f1a-05dc-466f-88aa-5eb27a8fb665", "[010] #0, state:up persistent_state:present, 'bond0': up connection bond0, c39326a6-2860-4305-9ba8-6fb920a3fcc0 (is-modified)", "[011] #1, state:up persistent_state:present, 'bond0.0': up connection bond0.0, ca811f30-7831-43b3-b534-83e0530ad93d (not-active)", "[012] #2, state:up persistent_state:present, 'bond0.1': up connection bond0.1, 70730f1a-05dc-466f-88aa-5eb27a8fb665 (not-active)" ] } 8454 1726882421.71873: no more pending results, returning what we have 8454 1726882421.71877: results queue empty 8454 1726882421.71878: checking for any_errors_fatal 8454 1726882421.71884: done checking for any_errors_fatal 8454 1726882421.71885: checking for max_fail_percentage 8454 1726882421.71887: done checking for max_fail_percentage 8454 1726882421.71888: checking to see if all hosts have failed and the running result is not ok 8454 1726882421.71889: done checking to see if all hosts have failed 8454 1726882421.71890: getting the remaining hosts for this loop 8454 1726882421.71892: done getting the remaining hosts for this loop 8454 1726882421.71897: getting the next task for host managed_node3 8454 1726882421.71904: done getting next 
task for host managed_node3 8454 1726882421.71909: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 8454 1726882421.71912: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=23, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8454 1726882421.71924: getting variables 8454 1726882421.71925: in VariableManager get_vars() 8454 1726882421.71968: Calling all_inventory to load vars for managed_node3 8454 1726882421.71971: Calling groups_inventory to load vars for managed_node3 8454 1726882421.71974: Calling all_plugins_inventory to load vars for managed_node3 8454 1726882421.71984: Calling all_plugins_play to load vars for managed_node3 8454 1726882421.71988: Calling groups_plugins_inventory to load vars for managed_node3 8454 1726882421.71991: Calling groups_plugins_play to load vars for managed_node3 8454 1726882421.76084: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8454 1726882421.79590: done with get_vars() 8454 1726882421.79626: done getting variables 8454 1726882421.79714: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_connections] *** task path: 
/tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181 Friday 20 September 2024 21:33:41 -0400 (0:00:00.137) 0:00:19.816 ****** 8454 1726882421.79895: entering _queue_task() for managed_node3/debug 8454 1726882421.80672: worker is 1 (out of 1 available) 8454 1726882421.80686: exiting _queue_task() for managed_node3/debug 8454 1726882421.80699: done queuing things up, now waiting for results queue to drain 8454 1726882421.80700: waiting for pending results... 8454 1726882421.81240: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 8454 1726882421.81319: in run() - task 0affe814-3a2d-f59f-16b9-000000000039 8454 1726882421.81351: variable 'ansible_search_path' from source: unknown 8454 1726882421.81359: variable 'ansible_search_path' from source: unknown 8454 1726882421.81402: calling self._execute() 8454 1726882421.81510: variable 'ansible_host' from source: host vars for 'managed_node3' 8454 1726882421.81524: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8454 1726882421.81543: variable 'omit' from source: magic vars 8454 1726882421.82045: variable 'ansible_distribution_major_version' from source: facts 8454 1726882421.82102: Evaluated conditional (ansible_distribution_major_version != '6'): True 8454 1726882421.82105: variable 'omit' from source: magic vars 8454 1726882421.82169: variable 'omit' from source: magic vars 8454 1726882421.82228: variable 'omit' from source: magic vars 8454 1726882421.82283: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 8454 1726882421.82430: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 8454 1726882421.82435: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 8454 1726882421.82438: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8454 1726882421.82440: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8454 1726882421.82465: variable 'inventory_hostname' from source: host vars for 'managed_node3' 8454 1726882421.82474: variable 'ansible_host' from source: host vars for 'managed_node3' 8454 1726882421.82482: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8454 1726882421.82606: Set connection var ansible_connection to ssh 8454 1726882421.82623: Set connection var ansible_shell_executable to /bin/sh 8454 1726882421.82670: Set connection var ansible_timeout to 10 8454 1726882421.82680: Set connection var ansible_shell_type to sh 8454 1726882421.82697: Set connection var ansible_pipelining to False 8454 1726882421.82708: Set connection var ansible_module_compression to ZIP_DEFLATED 8454 1726882421.82900: variable 'ansible_shell_executable' from source: unknown 8454 1726882421.82930: variable 'ansible_connection' from source: unknown 8454 1726882421.83211: variable 'ansible_module_compression' from source: unknown 8454 1726882421.83215: variable 'ansible_shell_type' from source: unknown 8454 1726882421.83217: variable 'ansible_shell_executable' from source: unknown 8454 1726882421.83219: variable 'ansible_host' from source: host vars for 'managed_node3' 8454 1726882421.83221: variable 'ansible_pipelining' from source: unknown 8454 1726882421.83223: variable 'ansible_timeout' from source: unknown 8454 1726882421.83225: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8454 1726882421.83427: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 
(found_in_cache=True, class_only=False) 8454 1726882421.83453: variable 'omit' from source: magic vars 8454 1726882421.83465: starting attempt loop 8454 1726882421.83472: running the handler 8454 1726882421.83671: variable '__network_connections_result' from source: set_fact 8454 1726882421.83784: variable '__network_connections_result' from source: set_fact 8454 1726882421.84105: handler run complete 8454 1726882421.84157: attempt loop complete, returning result 8454 1726882421.84166: _execute() done 8454 1726882421.84173: dumping result to json 8454 1726882421.84185: done dumping result, returning 8454 1726882421.84199: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [0affe814-3a2d-f59f-16b9-000000000039] 8454 1726882421.84216: sending task result for task 0affe814-3a2d-f59f-16b9-000000000039 ok: [managed_node3] => { "__network_connections_result": { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "bond": { "miimon": 110, "mode": "active-backup" }, "interface_name": "nm-bond", "ip": { "route_metric4": 65535 }, "name": "bond0", "state": "up", "type": "bond" }, { "controller": "bond0", "interface_name": "test1", "name": "bond0.0", "state": "up", "type": "ethernet" }, { "controller": "bond0", "interface_name": "test2", "name": "bond0.1", "state": "up", "type": "ethernet" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true, "failed": false, "stderr": "[007] #0, state:up persistent_state:present, 'bond0': add connection bond0, c39326a6-2860-4305-9ba8-6fb920a3fcc0\n[008] #1, state:up persistent_state:present, 'bond0.0': add connection bond0.0, ca811f30-7831-43b3-b534-83e0530ad93d\n[009] #2, state:up persistent_state:present, 'bond0.1': add connection bond0.1, 70730f1a-05dc-466f-88aa-5eb27a8fb665\n[010] #0, state:up persistent_state:present, 'bond0': 
up connection bond0, c39326a6-2860-4305-9ba8-6fb920a3fcc0 (is-modified)\n[011] #1, state:up persistent_state:present, 'bond0.0': up connection bond0.0, ca811f30-7831-43b3-b534-83e0530ad93d (not-active)\n[012] #2, state:up persistent_state:present, 'bond0.1': up connection bond0.1, 70730f1a-05dc-466f-88aa-5eb27a8fb665 (not-active)\n", "stderr_lines": [ "[007] #0, state:up persistent_state:present, 'bond0': add connection bond0, c39326a6-2860-4305-9ba8-6fb920a3fcc0", "[008] #1, state:up persistent_state:present, 'bond0.0': add connection bond0.0, ca811f30-7831-43b3-b534-83e0530ad93d", "[009] #2, state:up persistent_state:present, 'bond0.1': add connection bond0.1, 70730f1a-05dc-466f-88aa-5eb27a8fb665", "[010] #0, state:up persistent_state:present, 'bond0': up connection bond0, c39326a6-2860-4305-9ba8-6fb920a3fcc0 (is-modified)", "[011] #1, state:up persistent_state:present, 'bond0.0': up connection bond0.0, ca811f30-7831-43b3-b534-83e0530ad93d (not-active)", "[012] #2, state:up persistent_state:present, 'bond0.1': up connection bond0.1, 70730f1a-05dc-466f-88aa-5eb27a8fb665 (not-active)" ] } } 8454 1726882421.84676: no more pending results, returning what we have 8454 1726882421.84680: results queue empty 8454 1726882421.84681: checking for any_errors_fatal 8454 1726882421.84689: done checking for any_errors_fatal 8454 1726882421.84690: checking for max_fail_percentage 8454 1726882421.84698: done checking for max_fail_percentage 8454 1726882421.84699: checking to see if all hosts have failed and the running result is not ok 8454 1726882421.84700: done checking to see if all hosts have failed 8454 1726882421.84701: getting the remaining hosts for this loop 8454 1726882421.84703: done getting the remaining hosts for this loop 8454 1726882421.84708: getting the next task for host managed_node3 8454 1726882421.84716: done getting next task for host managed_node3 8454 1726882421.84720: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the 
network_state 8454 1726882421.84725: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8454 1726882421.84855: done sending task result for task 0affe814-3a2d-f59f-16b9-000000000039 8454 1726882421.84859: WORKER PROCESS EXITING 8454 1726882421.84866: getting variables 8454 1726882421.84867: in VariableManager get_vars() 8454 1726882421.84912: Calling all_inventory to load vars for managed_node3 8454 1726882421.84916: Calling groups_inventory to load vars for managed_node3 8454 1726882421.84919: Calling all_plugins_inventory to load vars for managed_node3 8454 1726882421.84930: Calling all_plugins_play to load vars for managed_node3 8454 1726882421.84935: Calling groups_plugins_inventory to load vars for managed_node3 8454 1726882421.84940: Calling groups_plugins_play to load vars for managed_node3 8454 1726882421.87323: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8454 1726882421.92970: done with get_vars() 8454 1726882421.93011: done getting variables 8454 1726882421.93097: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] *** task path: 
/tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186 Friday 20 September 2024 21:33:41 -0400 (0:00:00.132) 0:00:19.949 ****** 8454 1726882421.93145: entering _queue_task() for managed_node3/debug 8454 1726882421.93522: worker is 1 (out of 1 available) 8454 1726882421.93538: exiting _queue_task() for managed_node3/debug 8454 1726882421.93553: done queuing things up, now waiting for results queue to drain 8454 1726882421.93554: waiting for pending results... 8454 1726882421.93938: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 8454 1726882421.94222: in run() - task 0affe814-3a2d-f59f-16b9-00000000003a 8454 1726882421.94301: variable 'ansible_search_path' from source: unknown 8454 1726882421.94316: variable 'ansible_search_path' from source: unknown 8454 1726882421.94369: calling self._execute() 8454 1726882421.94514: variable 'ansible_host' from source: host vars for 'managed_node3' 8454 1726882421.94565: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8454 1726882421.94569: variable 'omit' from source: magic vars 8454 1726882421.95022: variable 'ansible_distribution_major_version' from source: facts 8454 1726882421.95045: Evaluated conditional (ansible_distribution_major_version != '6'): True 8454 1726882421.95224: variable 'network_state' from source: role '' defaults 8454 1726882421.95329: Evaluated conditional (network_state != {}): False 8454 1726882421.95332: when evaluation is False, skipping this task 8454 1726882421.95339: _execute() done 8454 1726882421.95342: dumping result to json 8454 1726882421.95345: done dumping result, returning 8454 1726882421.95347: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [0affe814-3a2d-f59f-16b9-00000000003a] 8454 1726882421.95350: sending task result for task 
0affe814-3a2d-f59f-16b9-00000000003a skipping: [managed_node3] => { "false_condition": "network_state != {}" } 8454 1726882421.95489: no more pending results, returning what we have 8454 1726882421.95494: results queue empty 8454 1726882421.95496: checking for any_errors_fatal 8454 1726882421.95506: done checking for any_errors_fatal 8454 1726882421.95507: checking for max_fail_percentage 8454 1726882421.95510: done checking for max_fail_percentage 8454 1726882421.95511: checking to see if all hosts have failed and the running result is not ok 8454 1726882421.95512: done checking to see if all hosts have failed 8454 1726882421.95513: getting the remaining hosts for this loop 8454 1726882421.95515: done getting the remaining hosts for this loop 8454 1726882421.95521: getting the next task for host managed_node3 8454 1726882421.95528: done getting next task for host managed_node3 8454 1726882421.95540: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity 8454 1726882421.95545: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8454 1726882421.95566: getting variables 8454 1726882421.95567: in VariableManager get_vars() 8454 1726882421.95619: Calling all_inventory to load vars for managed_node3 8454 1726882421.95623: Calling groups_inventory to load vars for managed_node3 8454 1726882421.95627: Calling all_plugins_inventory to load vars for managed_node3 8454 1726882421.95749: Calling all_plugins_play to load vars for managed_node3 8454 1726882421.95754: Calling groups_plugins_inventory to load vars for managed_node3 8454 1726882421.95843: Calling groups_plugins_play to load vars for managed_node3 8454 1726882421.96542: done sending task result for task 0affe814-3a2d-f59f-16b9-00000000003a 8454 1726882421.96545: WORKER PROCESS EXITING 8454 1726882421.99328: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8454 1726882422.02328: done with get_vars() 8454 1726882422.02372: done getting variables TASK [fedora.linux_system_roles.network : Re-test connectivity] **************** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192 Friday 20 September 2024 21:33:42 -0400 (0:00:00.093) 0:00:20.042 ****** 8454 1726882422.02516: entering _queue_task() for managed_node3/ping 8454 1726882422.02519: Creating lock for ping 8454 1726882422.02915: worker is 1 (out of 1 available) 8454 1726882422.02929: exiting _queue_task() for managed_node3/ping 8454 1726882422.02947: done queuing things up, now waiting for results queue to drain 8454 1726882422.02949: waiting for pending results... 
8454 1726882422.03269: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Re-test connectivity 8454 1726882422.03439: in run() - task 0affe814-3a2d-f59f-16b9-00000000003b 8454 1726882422.03470: variable 'ansible_search_path' from source: unknown 8454 1726882422.03486: variable 'ansible_search_path' from source: unknown 8454 1726882422.03533: calling self._execute() 8454 1726882422.03647: variable 'ansible_host' from source: host vars for 'managed_node3' 8454 1726882422.03662: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8454 1726882422.03686: variable 'omit' from source: magic vars 8454 1726882422.04168: variable 'ansible_distribution_major_version' from source: facts 8454 1726882422.04190: Evaluated conditional (ansible_distribution_major_version != '6'): True 8454 1726882422.04203: variable 'omit' from source: magic vars 8454 1726882422.04342: variable 'omit' from source: magic vars 8454 1726882422.04347: variable 'omit' from source: magic vars 8454 1726882422.04404: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 8454 1726882422.04456: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 8454 1726882422.04494: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 8454 1726882422.04522: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8454 1726882422.04544: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8454 1726882422.04597: variable 'inventory_hostname' from source: host vars for 'managed_node3' 8454 1726882422.04670: variable 'ansible_host' from source: host vars for 'managed_node3' 8454 1726882422.04673: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8454 
1726882422.04757: Set connection var ansible_connection to ssh 8454 1726882422.04783: Set connection var ansible_shell_executable to /bin/sh 8454 1726882422.04802: Set connection var ansible_timeout to 10 8454 1726882422.04811: Set connection var ansible_shell_type to sh 8454 1726882422.04827: Set connection var ansible_pipelining to False 8454 1726882422.04842: Set connection var ansible_module_compression to ZIP_DEFLATED 8454 1726882422.04872: variable 'ansible_shell_executable' from source: unknown 8454 1726882422.04891: variable 'ansible_connection' from source: unknown 8454 1726882422.04905: variable 'ansible_module_compression' from source: unknown 8454 1726882422.04914: variable 'ansible_shell_type' from source: unknown 8454 1726882422.04997: variable 'ansible_shell_executable' from source: unknown 8454 1726882422.05000: variable 'ansible_host' from source: host vars for 'managed_node3' 8454 1726882422.05004: variable 'ansible_pipelining' from source: unknown 8454 1726882422.05007: variable 'ansible_timeout' from source: unknown 8454 1726882422.05009: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8454 1726882422.05220: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 8454 1726882422.05245: variable 'omit' from source: magic vars 8454 1726882422.05324: starting attempt loop 8454 1726882422.05327: running the handler 8454 1726882422.05330: _low_level_execute_command(): starting 8454 1726882422.05332: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 8454 1726882422.06214: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8454 1726882422.06248: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 8454 1726882422.06269: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 8454 1726882422.06295: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8454 1726882422.06455: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8454 1726882422.08682: stdout chunk (state=3): >>>/root <<< 8454 1726882422.08686: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8454 1726882422.08689: stdout chunk (state=3): >>><<< 8454 1726882422.08790: stderr chunk (state=3): >>><<< 8454 1726882422.08793: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address 
debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 8454 1726882422.08796: _low_level_execute_command(): starting 8454 1726882422.08799: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882422.0872338-9175-46779016634801 `" && echo ansible-tmp-1726882422.0872338-9175-46779016634801="` echo /root/.ansible/tmp/ansible-tmp-1726882422.0872338-9175-46779016634801 `" ) && sleep 0' 8454 1726882422.09473: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 8454 1726882422.09497: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 8454 1726882422.09572: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 
originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8454 1726882422.09696: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 8454 1726882422.09806: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8454 1726882422.11924: stdout chunk (state=3): >>>ansible-tmp-1726882422.0872338-9175-46779016634801=/root/.ansible/tmp/ansible-tmp-1726882422.0872338-9175-46779016634801 <<< 8454 1726882422.12086: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8454 1726882422.12642: stderr chunk (state=3): >>><<< 8454 1726882422.12646: stdout chunk (state=3): >>><<< 8454 1726882422.12649: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882422.0872338-9175-46779016634801=/root/.ansible/tmp/ansible-tmp-1726882422.0872338-9175-46779016634801 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: 
mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 8454 1726882422.12652: variable 'ansible_module_compression' from source: unknown 8454 1726882422.12656: ANSIBALLZ: Using lock for ping 8454 1726882422.12658: ANSIBALLZ: Acquiring lock 8454 1726882422.12661: ANSIBALLZ: Lock acquired: 140055525096144 8454 1726882422.12663: ANSIBALLZ: Creating module 8454 1726882422.28650: ANSIBALLZ: Writing module into payload 8454 1726882422.28738: ANSIBALLZ: Writing module 8454 1726882422.28773: ANSIBALLZ: Renaming module 8454 1726882422.28790: ANSIBALLZ: Done creating module 8454 1726882422.28872: variable 'ansible_facts' from source: unknown 8454 1726882422.28903: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882422.0872338-9175-46779016634801/AnsiballZ_ping.py 8454 1726882422.29117: Sending initial data 8454 1726882422.29120: Sent initial data (150 bytes) 8454 1726882422.29856: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8454 1726882422.29902: stderr chunk (state=3): >>>debug1: 
auto-mux: Trying existing master <<< 8454 1726882422.29919: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 8454 1726882422.29944: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8454 1726882422.30117: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8454 1726882422.31954: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 8454 1726882422.32074: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 8454 1726882422.32206: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-8454ynh6eu7y/tmp1l6b_xyg /root/.ansible/tmp/ansible-tmp-1726882422.0872338-9175-46779016634801/AnsiballZ_ping.py <<< 8454 1726882422.32210: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882422.0872338-9175-46779016634801/AnsiballZ_ping.py" <<< 8454 1726882422.32305: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-8454ynh6eu7y/tmp1l6b_xyg" to remote "/root/.ansible/tmp/ansible-tmp-1726882422.0872338-9175-46779016634801/AnsiballZ_ping.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882422.0872338-9175-46779016634801/AnsiballZ_ping.py" <<< 8454 1726882422.33899: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8454 1726882422.33903: stderr chunk (state=3): >>><<< 8454 1726882422.33905: stdout chunk (state=3): >>><<< 8454 1726882422.33907: done transferring module to remote 8454 1726882422.33910: _low_level_execute_command(): starting 8454 1726882422.33912: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882422.0872338-9175-46779016634801/ /root/.ansible/tmp/ansible-tmp-1726882422.0872338-9175-46779016634801/AnsiballZ_ping.py && sleep 0' 8454 1726882422.34551: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8454 1726882422.34611: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 8454 1726882422.34628: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 8454 1726882422.34651: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8454 1726882422.34803: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8454 1726882422.36850: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8454 1726882422.36853: stdout chunk (state=3): >>><<< 8454 1726882422.36856: stderr chunk (state=3): >>><<< 8454 1726882422.36991: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 
setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 8454 1726882422.36994: _low_level_execute_command(): starting 8454 1726882422.36997: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882422.0872338-9175-46779016634801/AnsiballZ_ping.py && sleep 0' 8454 1726882422.37673: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 8454 1726882422.37693: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 8454 1726882422.37710: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 8454 1726882422.37746: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 8454 1726882422.37800: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8454 1726882422.37911: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 8454 1726882422.37939: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 8454 1726882422.38111: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8454 1726882422.55190: stdout chunk 
(state=3): >>> {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} <<< 8454 1726882422.56575: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.41.238 closed. <<< 8454 1726882422.56631: stderr chunk (state=3): >>><<< 8454 1726882422.56637: stdout chunk (state=3): >>><<< 8454 1726882422.56651: _low_level_execute_command() done: rc=0, stdout= {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.41.238 closed. 
8454 1726882422.56673: done with _execute_module (ping, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ping', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882422.0872338-9175-46779016634801/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 8454 1726882422.56685: _low_level_execute_command(): starting 8454 1726882422.56693: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882422.0872338-9175-46779016634801/ > /dev/null 2>&1 && sleep 0' 8454 1726882422.57121: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 8454 1726882422.57159: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 <<< 8454 1726882422.57163: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8454 1726882422.57166: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 8454 1726882422.57168: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 
'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8454 1726882422.57218: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 8454 1726882422.57222: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8454 1726882422.57342: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8454 1726882422.59546: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8454 1726882422.59549: stdout chunk (state=3): >>><<< 8454 1726882422.59552: stderr chunk (state=3): >>><<< 8454 1726882422.59555: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 8454 1726882422.59561: handler run complete 8454 1726882422.59564: 
attempt loop complete, returning result 8454 1726882422.59566: _execute() done 8454 1726882422.59568: dumping result to json 8454 1726882422.59571: done dumping result, returning 8454 1726882422.59573: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Re-test connectivity [0affe814-3a2d-f59f-16b9-00000000003b] 8454 1726882422.59575: sending task result for task 0affe814-3a2d-f59f-16b9-00000000003b 8454 1726882422.59971: done sending task result for task 0affe814-3a2d-f59f-16b9-00000000003b 8454 1726882422.59975: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "ping": "pong" } 8454 1726882422.60049: no more pending results, returning what we have 8454 1726882422.60053: results queue empty 8454 1726882422.60054: checking for any_errors_fatal 8454 1726882422.60060: done checking for any_errors_fatal 8454 1726882422.60061: checking for max_fail_percentage 8454 1726882422.60063: done checking for max_fail_percentage 8454 1726882422.60064: checking to see if all hosts have failed and the running result is not ok 8454 1726882422.60065: done checking to see if all hosts have failed 8454 1726882422.60066: getting the remaining hosts for this loop 8454 1726882422.60068: done getting the remaining hosts for this loop 8454 1726882422.60072: getting the next task for host managed_node3 8454 1726882422.60083: done getting next task for host managed_node3 8454 1726882422.60091: ^ task is: TASK: meta (role_complete) 8454 1726882422.60095: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8454 1726882422.60108: getting variables 8454 1726882422.60110: in VariableManager get_vars() 8454 1726882422.60162: Calling all_inventory to load vars for managed_node3 8454 1726882422.60166: Calling groups_inventory to load vars for managed_node3 8454 1726882422.60170: Calling all_plugins_inventory to load vars for managed_node3 8454 1726882422.60183: Calling all_plugins_play to load vars for managed_node3 8454 1726882422.60187: Calling groups_plugins_inventory to load vars for managed_node3 8454 1726882422.60191: Calling groups_plugins_play to load vars for managed_node3 8454 1726882422.62870: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8454 1726882422.65916: done with get_vars() 8454 1726882422.65955: done getting variables 8454 1726882422.66063: done queuing things up, now waiting for results queue to drain 8454 1726882422.66066: results queue empty 8454 1726882422.66067: checking for any_errors_fatal 8454 1726882422.66071: done checking for any_errors_fatal 8454 1726882422.66072: checking for max_fail_percentage 8454 1726882422.66073: done checking for max_fail_percentage 8454 1726882422.66074: checking to see if all hosts have failed and the running result is not ok 8454 1726882422.66075: done checking to see if all hosts have failed 8454 1726882422.66076: getting the remaining hosts for this loop 8454 1726882422.66077: done getting the remaining hosts for this loop 8454 1726882422.66085: getting the next task for host managed_node3 8454 1726882422.66093: done getting next task for host managed_node3 8454 1726882422.66100: ^ task is: TASK: Include the task 'get_interface_stat.yml' 8454 1726882422.66103: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8454 1726882422.66106: getting variables 8454 1726882422.66108: in VariableManager get_vars() 8454 1726882422.66125: Calling all_inventory to load vars for managed_node3 8454 1726882422.66128: Calling groups_inventory to load vars for managed_node3 8454 1726882422.66131: Calling all_plugins_inventory to load vars for managed_node3 8454 1726882422.66139: Calling all_plugins_play to load vars for managed_node3 8454 1726882422.66142: Calling groups_plugins_inventory to load vars for managed_node3 8454 1726882422.66146: Calling groups_plugins_play to load vars for managed_node3 8454 1726882422.68219: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8454 1726882422.71407: done with get_vars() 8454 1726882422.71452: done getting variables TASK [Include the task 'get_interface_stat.yml'] ******************************* task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:3 Friday 20 September 2024 21:33:42 -0400 (0:00:00.690) 0:00:20.733 ****** 8454 1726882422.71558: entering _queue_task() for managed_node3/include_tasks 8454 1726882422.71955: worker is 1 (out of 1 available) 8454 1726882422.72047: exiting _queue_task() for managed_node3/include_tasks 8454 1726882422.72062: done queuing things up, now waiting for results queue to drain 8454 1726882422.72063: waiting for pending results... 
8454 1726882422.72326: running TaskExecutor() for managed_node3/TASK: Include the task 'get_interface_stat.yml' 8454 1726882422.72742: in run() - task 0affe814-3a2d-f59f-16b9-00000000006e 8454 1726882422.72808: variable 'ansible_search_path' from source: unknown 8454 1726882422.72812: variable 'ansible_search_path' from source: unknown 8454 1726882422.72817: calling self._execute() 8454 1726882422.73063: variable 'ansible_host' from source: host vars for 'managed_node3' 8454 1726882422.73068: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8454 1726882422.73173: variable 'omit' from source: magic vars 8454 1726882422.74007: variable 'ansible_distribution_major_version' from source: facts 8454 1726882422.74030: Evaluated conditional (ansible_distribution_major_version != '6'): True 8454 1726882422.74073: _execute() done 8454 1726882422.74083: dumping result to json 8454 1726882422.74094: done dumping result, returning 8454 1726882422.74105: done running TaskExecutor() for managed_node3/TASK: Include the task 'get_interface_stat.yml' [0affe814-3a2d-f59f-16b9-00000000006e] 8454 1726882422.74117: sending task result for task 0affe814-3a2d-f59f-16b9-00000000006e 8454 1726882422.74308: no more pending results, returning what we have 8454 1726882422.74315: in VariableManager get_vars() 8454 1726882422.74445: Calling all_inventory to load vars for managed_node3 8454 1726882422.74450: Calling groups_inventory to load vars for managed_node3 8454 1726882422.74453: Calling all_plugins_inventory to load vars for managed_node3 8454 1726882422.74470: Calling all_plugins_play to load vars for managed_node3 8454 1726882422.74482: Calling groups_plugins_inventory to load vars for managed_node3 8454 1726882422.74489: Calling groups_plugins_play to load vars for managed_node3 8454 1726882422.75051: done sending task result for task 0affe814-3a2d-f59f-16b9-00000000006e 8454 1726882422.75055: WORKER PROCESS EXITING 8454 1726882422.78125: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8454 1726882422.84371: done with get_vars() 8454 1726882422.84529: variable 'ansible_search_path' from source: unknown 8454 1726882422.84530: variable 'ansible_search_path' from source: unknown 8454 1726882422.84585: we have included files to process 8454 1726882422.84587: generating all_blocks data 8454 1726882422.84589: done generating all_blocks data 8454 1726882422.84594: processing included file: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 8454 1726882422.84596: loading included file: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 8454 1726882422.84599: Loading data from /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 8454 1726882422.85192: done processing included file 8454 1726882422.85195: iterating over new_blocks loaded from include file 8454 1726882422.85198: in VariableManager get_vars() 8454 1726882422.85230: done with get_vars() 8454 1726882422.85232: filtering new block on tags 8454 1726882422.85257: done filtering new block on tags 8454 1726882422.85261: done iterating over new_blocks loaded from include file included: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml for managed_node3 8454 1726882422.85267: extending task lists for all hosts with included blocks 8454 1726882422.85596: done extending task lists 8454 1726882422.85598: done processing included files 8454 1726882422.85599: results queue empty 8454 1726882422.85600: checking for any_errors_fatal 8454 1726882422.85602: done checking for any_errors_fatal 8454 1726882422.85603: checking for max_fail_percentage 8454 1726882422.85604: done checking for max_fail_percentage 8454 1726882422.85606: 
checking to see if all hosts have failed and the running result is not ok 8454 1726882422.85607: done checking to see if all hosts have failed 8454 1726882422.85608: getting the remaining hosts for this loop 8454 1726882422.85609: done getting the remaining hosts for this loop 8454 1726882422.85612: getting the next task for host managed_node3 8454 1726882422.85617: done getting next task for host managed_node3 8454 1726882422.85620: ^ task is: TASK: Get stat for interface {{ interface }} 8454 1726882422.85624: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8454 1726882422.85627: getting variables 8454 1726882422.85628: in VariableManager get_vars() 8454 1726882422.85750: Calling all_inventory to load vars for managed_node3 8454 1726882422.85753: Calling groups_inventory to load vars for managed_node3 8454 1726882422.85756: Calling all_plugins_inventory to load vars for managed_node3 8454 1726882422.85763: Calling all_plugins_play to load vars for managed_node3 8454 1726882422.85841: Calling groups_plugins_inventory to load vars for managed_node3 8454 1726882422.85846: Calling groups_plugins_play to load vars for managed_node3 8454 1726882422.95805: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8454 1726882423.01491: done with get_vars() 8454 1726882423.01652: done getting variables 8454 1726882423.02142: variable 'interface' from source: task vars 8454 1726882423.02146: variable 'controller_device' from source: play vars 8454 1726882423.02224: variable 'controller_device' from source: play vars TASK [Get stat for interface nm-bond] ****************************************** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml:3 Friday 20 September 2024 21:33:43 -0400 (0:00:00.307) 0:00:21.040 ****** 8454 1726882423.02294: entering _queue_task() for managed_node3/stat 8454 1726882423.03070: worker is 1 (out of 1 available) 8454 1726882423.03083: exiting _queue_task() for managed_node3/stat 8454 1726882423.03098: done queuing things up, now waiting for results queue to drain 8454 1726882423.03100: waiting for pending results... 
8454 1726882423.03618: running TaskExecutor() for managed_node3/TASK: Get stat for interface nm-bond 8454 1726882423.03971: in run() - task 0affe814-3a2d-f59f-16b9-000000000241 8454 1726882423.03989: variable 'ansible_search_path' from source: unknown 8454 1726882423.03993: variable 'ansible_search_path' from source: unknown 8454 1726882423.04128: calling self._execute() 8454 1726882423.04132: variable 'ansible_host' from source: host vars for 'managed_node3' 8454 1726882423.04137: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8454 1726882423.04352: variable 'omit' from source: magic vars 8454 1726882423.05188: variable 'ansible_distribution_major_version' from source: facts 8454 1726882423.05202: Evaluated conditional (ansible_distribution_major_version != '6'): True 8454 1726882423.05214: variable 'omit' from source: magic vars 8454 1726882423.05278: variable 'omit' from source: magic vars 8454 1726882423.05598: variable 'interface' from source: task vars 8454 1726882423.05602: variable 'controller_device' from source: play vars 8454 1726882423.05677: variable 'controller_device' from source: play vars 8454 1726882423.05704: variable 'omit' from source: magic vars 8454 1726882423.05954: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 8454 1726882423.05994: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 8454 1726882423.06040: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 8454 1726882423.06044: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8454 1726882423.06049: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8454 1726882423.06089: variable 'inventory_hostname' from source: host vars for 'managed_node3' 8454 
1726882423.06097: variable 'ansible_host' from source: host vars for 'managed_node3' 8454 1726882423.06100: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8454 1726882423.06436: Set connection var ansible_connection to ssh 8454 1726882423.06444: Set connection var ansible_shell_executable to /bin/sh 8454 1726882423.06451: Set connection var ansible_timeout to 10 8454 1726882423.06454: Set connection var ansible_shell_type to sh 8454 1726882423.06465: Set connection var ansible_pipelining to False 8454 1726882423.06472: Set connection var ansible_module_compression to ZIP_DEFLATED 8454 1726882423.06506: variable 'ansible_shell_executable' from source: unknown 8454 1726882423.06510: variable 'ansible_connection' from source: unknown 8454 1726882423.06513: variable 'ansible_module_compression' from source: unknown 8454 1726882423.06515: variable 'ansible_shell_type' from source: unknown 8454 1726882423.06520: variable 'ansible_shell_executable' from source: unknown 8454 1726882423.06523: variable 'ansible_host' from source: host vars for 'managed_node3' 8454 1726882423.06529: variable 'ansible_pipelining' from source: unknown 8454 1726882423.06532: variable 'ansible_timeout' from source: unknown 8454 1726882423.06753: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8454 1726882423.07199: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 8454 1726882423.07211: variable 'omit' from source: magic vars 8454 1726882423.07219: starting attempt loop 8454 1726882423.07222: running the handler 8454 1726882423.07241: _low_level_execute_command(): starting 8454 1726882423.07251: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 8454 1726882423.09143: stderr chunk (state=2): 
>>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 8454 1726882423.09146: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 8454 1726882423.09352: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 8454 1726882423.09358: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8454 1726882423.09584: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8454 1726882423.11403: stdout chunk (state=3): >>>/root <<< 8454 1726882423.11640: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8454 1726882423.11651: stderr chunk (state=3): >>><<< 8454 1726882423.11654: stdout chunk (state=3): >>><<< 8454 1726882423.11690: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found 
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 8454 1726882423.11706: _low_level_execute_command(): starting 8454 1726882423.11714: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882423.1169136-9198-170211136740535 `" && echo ansible-tmp-1726882423.1169136-9198-170211136740535="` echo /root/.ansible/tmp/ansible-tmp-1726882423.1169136-9198-170211136740535 `" ) && sleep 0' 8454 1726882423.12709: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 8454 1726882423.12713: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found <<< 8454 1726882423.12717: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8454 1726882423.12725: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing 
configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8454 1726882423.12808: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 8454 1726882423.12881: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8454 1726882423.13005: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8454 1726882423.15447: stdout chunk (state=3): >>>ansible-tmp-1726882423.1169136-9198-170211136740535=/root/.ansible/tmp/ansible-tmp-1726882423.1169136-9198-170211136740535 <<< 8454 1726882423.15452: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8454 1726882423.15454: stdout chunk (state=3): >>><<< 8454 1726882423.15457: stderr chunk (state=3): >>><<< 8454 1726882423.15744: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882423.1169136-9198-170211136740535=/root/.ansible/tmp/ansible-tmp-1726882423.1169136-9198-170211136740535 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration 
data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 8454 1726882423.15747: variable 'ansible_module_compression' from source: unknown 8454 1726882423.15750: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-8454ynh6eu7y/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 8454 1726882423.15878: variable 'ansible_facts' from source: unknown 8454 1726882423.16083: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882423.1169136-9198-170211136740535/AnsiballZ_stat.py 8454 1726882423.16592: Sending initial data 8454 1726882423.16596: Sent initial data (151 bytes) 8454 1726882423.18046: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 8454 1726882423.18055: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 8454 1726882423.18199: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 8454 1726882423.18539: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 8454 1726882423.18543: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8454 1726882423.20297: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 8454 1726882423.20391: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 8454 1726882423.20512: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-8454ynh6eu7y/tmpks4hfsda /root/.ansible/tmp/ansible-tmp-1726882423.1169136-9198-170211136740535/AnsiballZ_stat.py <<< 8454 1726882423.20516: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882423.1169136-9198-170211136740535/AnsiballZ_stat.py" <<< 8454 1726882423.20629: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-8454ynh6eu7y/tmpks4hfsda" to remote "/root/.ansible/tmp/ansible-tmp-1726882423.1169136-9198-170211136740535/AnsiballZ_stat.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882423.1169136-9198-170211136740535/AnsiballZ_stat.py" <<< 8454 1726882423.22555: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8454 1726882423.22592: stderr chunk (state=3): >>><<< 8454 1726882423.22602: stdout chunk (state=3): >>><<< 8454 1726882423.22631: done transferring module to remote 8454 1726882423.22647: _low_level_execute_command(): starting 8454 1726882423.22653: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882423.1169136-9198-170211136740535/ /root/.ansible/tmp/ansible-tmp-1726882423.1169136-9198-170211136740535/AnsiballZ_stat.py && sleep 0' 8454 1726882423.23285: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 8454 1726882423.23302: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 8454 1726882423.23313: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 8454 1726882423.23328: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 8454 1726882423.23344: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 <<< 8454 1726882423.23353: stderr chunk 
(state=3): >>>debug2: match not found <<< 8454 1726882423.23363: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8454 1726882423.23378: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 8454 1726882423.23397: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.41.238 is address <<< 8454 1726882423.23493: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 8454 1726882423.23516: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8454 1726882423.23680: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8454 1726882423.25689: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8454 1726882423.25693: stdout chunk (state=3): >>><<< 8454 1726882423.25839: stderr chunk (state=3): >>><<< 8454 1726882423.25843: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 8454 1726882423.25846: _low_level_execute_command(): starting 8454 1726882423.25848: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882423.1169136-9198-170211136740535/AnsiballZ_stat.py && sleep 0' 8454 1726882423.26427: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 8454 1726882423.26440: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 8454 1726882423.26452: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 8454 1726882423.26469: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 8454 1726882423.26485: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 <<< 8454 1726882423.26508: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8454 1726882423.26553: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 
originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8454 1726882423.26638: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 8454 1726882423.26652: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 8454 1726882423.26674: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8454 1726882423.26918: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8454 1726882423.44269: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": true, "path": "/sys/class/net/nm-bond", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 35546, "dev": 23, "nlink": 1, "atime": 1726882421.2120228, "mtime": 1726882421.2120228, "ctime": 1726882421.2120228, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/nm-bond", "lnk_target": "../../devices/virtual/net/nm-bond", "pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/nm-bond", "follow": false, "checksum_algorithm": "sha1"}}} <<< 8454 1726882423.45764: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.41.238 closed. 
<<< 8454 1726882423.45827: stderr chunk (state=3): >>><<< 8454 1726882423.45841: stdout chunk (state=3): >>><<< 8454 1726882423.45877: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": true, "path": "/sys/class/net/nm-bond", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 35546, "dev": 23, "nlink": 1, "atime": 1726882421.2120228, "mtime": 1726882421.2120228, "ctime": 1726882421.2120228, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/nm-bond", "lnk_target": "../../devices/virtual/net/nm-bond", "pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/nm-bond", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.41.238 closed. 8454 1726882423.45986: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/sys/class/net/nm-bond', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882423.1169136-9198-170211136740535/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 8454 1726882423.45991: _low_level_execute_command(): starting 8454 1726882423.46003: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882423.1169136-9198-170211136740535/ > /dev/null 2>&1 && sleep 0' 8454 1726882423.46841: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 8454 1726882423.46857: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: 
hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 8454 1726882423.46907: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8454 1726882423.47041: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8454 1726882423.49260: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8454 1726882423.49264: stdout chunk (state=3): >>><<< 8454 1726882423.49267: stderr chunk (state=3): >>><<< 8454 1726882423.49270: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK 
debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 8454 1726882423.49272: handler run complete 8454 1726882423.49298: attempt loop complete, returning result 8454 1726882423.49313: _execute() done 8454 1726882423.49323: dumping result to json 8454 1726882423.49391: done dumping result, returning 8454 1726882423.49396: done running TaskExecutor() for managed_node3/TASK: Get stat for interface nm-bond [0affe814-3a2d-f59f-16b9-000000000241] 8454 1726882423.49398: sending task result for task 0affe814-3a2d-f59f-16b9-000000000241 8454 1726882423.49855: done sending task result for task 0affe814-3a2d-f59f-16b9-000000000241 8454 1726882423.49859: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "stat": { "atime": 1726882421.2120228, "block_size": 4096, "blocks": 0, "ctime": 1726882421.2120228, "dev": 23, "device_type": 0, "executable": true, "exists": true, "gid": 0, "gr_name": "root", "inode": 35546, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": true, "isreg": false, "issock": false, "isuid": false, "lnk_source": "/sys/devices/virtual/net/nm-bond", "lnk_target": "../../devices/virtual/net/nm-bond", "mode": "0777", "mtime": 1726882421.2120228, "nlink": 1, "path": "/sys/class/net/nm-bond", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 0, "uid": 0, "wgrp": true, "woth": true, "writeable": true, "wusr": true, "xgrp": true, "xoth": true, "xusr": true } } 8454 1726882423.49980: no more pending results, returning what we have 8454 1726882423.49984: results queue empty 8454 1726882423.49985: checking for any_errors_fatal 8454 1726882423.49987: done checking for any_errors_fatal 8454 1726882423.49987: checking for max_fail_percentage 8454 1726882423.49989: done checking for max_fail_percentage 8454 1726882423.49990: checking to see if all hosts have failed and the running result 
is not ok 8454 1726882423.49991: done checking to see if all hosts have failed 8454 1726882423.49992: getting the remaining hosts for this loop 8454 1726882423.49993: done getting the remaining hosts for this loop 8454 1726882423.49998: getting the next task for host managed_node3 8454 1726882423.50006: done getting next task for host managed_node3 8454 1726882423.50008: ^ task is: TASK: Assert that the interface is present - '{{ interface }}' 8454 1726882423.50011: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8454 1726882423.50015: getting variables 8454 1726882423.50017: in VariableManager get_vars() 8454 1726882423.50114: Calling all_inventory to load vars for managed_node3 8454 1726882423.50118: Calling groups_inventory to load vars for managed_node3 8454 1726882423.50121: Calling all_plugins_inventory to load vars for managed_node3 8454 1726882423.50132: Calling all_plugins_play to load vars for managed_node3 8454 1726882423.50138: Calling groups_plugins_inventory to load vars for managed_node3 8454 1726882423.50143: Calling groups_plugins_play to load vars for managed_node3 8454 1726882423.52247: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8454 1726882423.55157: done with get_vars() 8454 1726882423.55192: done getting variables 8454 1726882423.55274: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 8454 1726882423.55418: variable 'interface' from source: task vars 8454 1726882423.55422: variable 'controller_device' from source: play vars 8454 1726882423.55498: variable 'controller_device' from source: play vars TASK [Assert that the interface is present - 'nm-bond'] ************************ task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:5 Friday 20 September 2024 21:33:43 -0400 (0:00:00.532) 0:00:21.573 ****** 8454 1726882423.55535: entering _queue_task() for managed_node3/assert 8454 1726882423.55865: worker is 1 (out of 1 available) 8454 1726882423.55878: exiting _queue_task() for managed_node3/assert 8454 1726882423.55892: done queuing things up, now waiting for results queue to drain 8454 1726882423.55894: waiting for pending results... 
8454 1726882423.56356: running TaskExecutor() for managed_node3/TASK: Assert that the interface is present - 'nm-bond' 8454 1726882423.56361: in run() - task 0affe814-3a2d-f59f-16b9-00000000006f 8454 1726882423.56388: variable 'ansible_search_path' from source: unknown 8454 1726882423.56398: variable 'ansible_search_path' from source: unknown 8454 1726882423.56450: calling self._execute() 8454 1726882423.56574: variable 'ansible_host' from source: host vars for 'managed_node3' 8454 1726882423.56594: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8454 1726882423.56613: variable 'omit' from source: magic vars 8454 1726882423.57054: variable 'ansible_distribution_major_version' from source: facts 8454 1726882423.57075: Evaluated conditional (ansible_distribution_major_version != '6'): True 8454 1726882423.57087: variable 'omit' from source: magic vars 8454 1726882423.57161: variable 'omit' from source: magic vars 8454 1726882423.57285: variable 'interface' from source: task vars 8454 1726882423.57297: variable 'controller_device' from source: play vars 8454 1726882423.57482: variable 'controller_device' from source: play vars 8454 1726882423.57491: variable 'omit' from source: magic vars 8454 1726882423.57543: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 8454 1726882423.57596: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 8454 1726882423.57625: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 8454 1726882423.57655: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8454 1726882423.57677: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8454 1726882423.57719: variable 'inventory_hostname' from source: host vars for 
'managed_node3' 8454 1726882423.57728: variable 'ansible_host' from source: host vars for 'managed_node3' 8454 1726882423.57740: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8454 1726882423.57871: Set connection var ansible_connection to ssh 8454 1726882423.57895: Set connection var ansible_shell_executable to /bin/sh 8454 1726882423.58004: Set connection var ansible_timeout to 10 8454 1726882423.58008: Set connection var ansible_shell_type to sh 8454 1726882423.58010: Set connection var ansible_pipelining to False 8454 1726882423.58013: Set connection var ansible_module_compression to ZIP_DEFLATED 8454 1726882423.58015: variable 'ansible_shell_executable' from source: unknown 8454 1726882423.58017: variable 'ansible_connection' from source: unknown 8454 1726882423.58020: variable 'ansible_module_compression' from source: unknown 8454 1726882423.58022: variable 'ansible_shell_type' from source: unknown 8454 1726882423.58024: variable 'ansible_shell_executable' from source: unknown 8454 1726882423.58026: variable 'ansible_host' from source: host vars for 'managed_node3' 8454 1726882423.58028: variable 'ansible_pipelining' from source: unknown 8454 1726882423.58030: variable 'ansible_timeout' from source: unknown 8454 1726882423.58032: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8454 1726882423.58191: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 8454 1726882423.58213: variable 'omit' from source: magic vars 8454 1726882423.58232: starting attempt loop 8454 1726882423.58243: running the handler 8454 1726882423.58414: variable 'interface_stat' from source: set_fact 8454 1726882423.58449: Evaluated conditional (interface_stat.stat.exists): True 8454 
1726882423.58461: handler run complete 8454 1726882423.58483: attempt loop complete, returning result 8454 1726882423.58492: _execute() done 8454 1726882423.58500: dumping result to json 8454 1726882423.58507: done dumping result, returning 8454 1726882423.58519: done running TaskExecutor() for managed_node3/TASK: Assert that the interface is present - 'nm-bond' [0affe814-3a2d-f59f-16b9-00000000006f] 8454 1726882423.58540: sending task result for task 0affe814-3a2d-f59f-16b9-00000000006f 8454 1726882423.58744: done sending task result for task 0affe814-3a2d-f59f-16b9-00000000006f 8454 1726882423.58747: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false } MSG: All assertions passed 8454 1726882423.58825: no more pending results, returning what we have 8454 1726882423.58830: results queue empty 8454 1726882423.58831: checking for any_errors_fatal 8454 1726882423.58842: done checking for any_errors_fatal 8454 1726882423.58844: checking for max_fail_percentage 8454 1726882423.58846: done checking for max_fail_percentage 8454 1726882423.58848: checking to see if all hosts have failed and the running result is not ok 8454 1726882423.58849: done checking to see if all hosts have failed 8454 1726882423.58850: getting the remaining hosts for this loop 8454 1726882423.58853: done getting the remaining hosts for this loop 8454 1726882423.58858: getting the next task for host managed_node3 8454 1726882423.58869: done getting next task for host managed_node3 8454 1726882423.58874: ^ task is: TASK: Include the task 'assert_profile_present.yml' 8454 1726882423.58876: ^ state is: HOST STATE: block=2, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8454 1726882423.58881: getting variables 8454 1726882423.58883: in VariableManager get_vars() 8454 1726882423.59039: Calling all_inventory to load vars for managed_node3 8454 1726882423.59044: Calling groups_inventory to load vars for managed_node3 8454 1726882423.59048: Calling all_plugins_inventory to load vars for managed_node3 8454 1726882423.59061: Calling all_plugins_play to load vars for managed_node3 8454 1726882423.59065: Calling groups_plugins_inventory to load vars for managed_node3 8454 1726882423.59069: Calling groups_plugins_play to load vars for managed_node3 8454 1726882423.62707: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8454 1726882423.68029: done with get_vars() 8454 1726882423.68272: done getting variables TASK [Include the task 'assert_profile_present.yml'] *************************** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond.yml:67 Friday 20 September 2024 21:33:43 -0400 (0:00:00.128) 0:00:21.701 ****** 8454 1726882423.68388: entering _queue_task() for managed_node3/include_tasks 8454 1726882423.69156: worker is 1 (out of 1 available) 8454 1726882423.69171: exiting _queue_task() for managed_node3/include_tasks 8454 1726882423.69186: done queuing things up, now waiting for results queue to drain 8454 1726882423.69188: waiting for pending results... 
8454 1726882423.69857: running TaskExecutor() for managed_node3/TASK: Include the task 'assert_profile_present.yml' 8454 1726882423.69951: in run() - task 0affe814-3a2d-f59f-16b9-000000000070 8454 1726882423.69966: variable 'ansible_search_path' from source: unknown 8454 1726882423.70138: variable 'controller_profile' from source: play vars 8454 1726882423.70981: variable 'controller_profile' from source: play vars 8454 1726882423.70999: variable 'port1_profile' from source: play vars 8454 1726882423.71085: variable 'port1_profile' from source: play vars 8454 1726882423.71094: variable 'port2_profile' from source: play vars 8454 1726882423.71593: variable 'port2_profile' from source: play vars 8454 1726882423.71596: variable 'omit' from source: magic vars 8454 1726882423.72047: variable 'ansible_host' from source: host vars for 'managed_node3' 8454 1726882423.72139: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8454 1726882423.72144: variable 'omit' from source: magic vars 8454 1726882423.72569: variable 'ansible_distribution_major_version' from source: facts 8454 1726882423.72579: Evaluated conditional (ansible_distribution_major_version != '6'): True 8454 1726882423.72616: variable 'item' from source: unknown 8454 1726882423.72897: variable 'item' from source: unknown 8454 1726882423.73141: variable 'ansible_host' from source: host vars for 'managed_node3' 8454 1726882423.73145: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8454 1726882423.73148: variable 'omit' from source: magic vars 8454 1726882423.73403: variable 'ansible_distribution_major_version' from source: facts 8454 1726882423.73408: Evaluated conditional (ansible_distribution_major_version != '6'): True 8454 1726882423.73645: variable 'item' from source: unknown 8454 1726882423.73839: variable 'item' from source: unknown 8454 1726882423.73906: variable 'ansible_host' from source: host vars for 'managed_node3' 8454 1726882423.73909: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8454 1726882423.73912: variable 'omit' from source: magic vars 8454 1726882423.74340: variable 'ansible_distribution_major_version' from source: facts 8454 1726882423.74738: Evaluated conditional (ansible_distribution_major_version != '6'): True 8454 1726882423.74742: variable 'item' from source: unknown 8454 1726882423.74745: variable 'item' from source: unknown 8454 1726882423.74808: dumping result to json 8454 1726882423.74811: done dumping result, returning 8454 1726882423.74814: done running TaskExecutor() for managed_node3/TASK: Include the task 'assert_profile_present.yml' [0affe814-3a2d-f59f-16b9-000000000070] 8454 1726882423.74817: sending task result for task 0affe814-3a2d-f59f-16b9-000000000070 8454 1726882423.74863: done sending task result for task 0affe814-3a2d-f59f-16b9-000000000070 8454 1726882423.74866: WORKER PROCESS EXITING 8454 1726882423.74951: no more pending results, returning what we have 8454 1726882423.74957: in VariableManager get_vars() 8454 1726882423.75013: Calling all_inventory to load vars for managed_node3 8454 1726882423.75017: Calling groups_inventory to load vars for managed_node3 8454 1726882423.75020: Calling all_plugins_inventory to load vars for managed_node3 8454 1726882423.75037: Calling all_plugins_play to load vars for managed_node3 8454 1726882423.75042: Calling groups_plugins_inventory to load vars for managed_node3 8454 1726882423.75046: Calling groups_plugins_play to load vars for managed_node3 8454 1726882423.79746: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8454 1726882423.86125: done with get_vars() 8454 1726882423.86208: variable 'ansible_search_path' from source: unknown 8454 1726882423.86302: variable 'ansible_search_path' from source: unknown 8454 1726882423.86318: variable 'ansible_search_path' from source: unknown 8454 1726882423.86328: we have included files to process 8454 
1726882423.86329: generating all_blocks data 8454 1726882423.86331: done generating all_blocks data 8454 1726882423.86341: processing included file: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 8454 1726882423.86343: loading included file: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 8454 1726882423.86347: Loading data from /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 8454 1726882423.86918: in VariableManager get_vars() 8454 1726882423.87017: done with get_vars() 8454 1726882423.87752: done processing included file 8454 1726882423.87759: iterating over new_blocks loaded from include file 8454 1726882423.87761: in VariableManager get_vars() 8454 1726882423.87785: done with get_vars() 8454 1726882423.87788: filtering new block on tags 8454 1726882423.87816: done filtering new block on tags 8454 1726882423.87819: done iterating over new_blocks loaded from include file included: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml for managed_node3 => (item=bond0) 8454 1726882423.87825: processing included file: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 8454 1726882423.87826: loading included file: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 8454 1726882423.87830: Loading data from /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 8454 1726882423.88179: in VariableManager get_vars() 8454 1726882423.88321: done with get_vars() 8454 1726882423.89163: done processing included file 8454 1726882423.89165: iterating over new_blocks loaded 
from include file 8454 1726882423.89167: in VariableManager get_vars() 8454 1726882423.89189: done with get_vars() 8454 1726882423.89191: filtering new block on tags 8454 1726882423.89216: done filtering new block on tags 8454 1726882423.89219: done iterating over new_blocks loaded from include file included: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml for managed_node3 => (item=bond0.0) 8454 1726882423.89224: processing included file: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 8454 1726882423.89226: loading included file: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 8454 1726882423.89229: Loading data from /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 8454 1726882423.89757: in VariableManager get_vars() 8454 1726882423.89784: done with get_vars() 8454 1726882423.90070: done processing included file 8454 1726882423.90072: iterating over new_blocks loaded from include file 8454 1726882423.90074: in VariableManager get_vars() 8454 1726882423.90096: done with get_vars() 8454 1726882423.90099: filtering new block on tags 8454 1726882423.90123: done filtering new block on tags 8454 1726882423.90126: done iterating over new_blocks loaded from include file included: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml for managed_node3 => (item=bond0.1) 8454 1726882423.90130: extending task lists for all hosts with included blocks 8454 1726882423.93908: done extending task lists 8454 1726882423.93915: done processing included files 8454 1726882423.93916: results queue empty 8454 1726882423.93917: checking for any_errors_fatal 8454 1726882423.93921: done checking for any_errors_fatal 8454 
1726882423.93922: checking for max_fail_percentage 8454 1726882423.93924: done checking for max_fail_percentage 8454 1726882423.93925: checking to see if all hosts have failed and the running result is not ok 8454 1726882423.93926: done checking to see if all hosts have failed 8454 1726882423.93927: getting the remaining hosts for this loop 8454 1726882423.93928: done getting the remaining hosts for this loop 8454 1726882423.93932: getting the next task for host managed_node3 8454 1726882423.93939: done getting next task for host managed_node3 8454 1726882423.93941: ^ task is: TASK: Include the task 'get_profile_stat.yml' 8454 1726882423.93944: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False
8454 1726882423.93947: getting variables
8454 1726882423.93948: in VariableManager get_vars()
8454 1726882423.93964: Calling all_inventory to load vars for managed_node3
8454 1726882423.93967: Calling groups_inventory to load vars for managed_node3
8454 1726882423.93970: Calling all_plugins_inventory to load vars for managed_node3
8454 1726882423.93977: Calling all_plugins_play to load vars for managed_node3
8454 1726882423.93980: Calling groups_plugins_inventory to load vars for managed_node3
8454 1726882423.93984: Calling groups_plugins_play to load vars for managed_node3
8454 1726882423.95920: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
8454 1726882423.98856: done with get_vars()
8454 1726882423.98886: done getting variables

TASK [Include the task 'get_profile_stat.yml'] *********************************
task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:3
Friday 20 September 2024 21:33:43 -0400 (0:00:00.305) 0:00:22.007 ******

8454 1726882423.98976: entering _queue_task() for managed_node3/include_tasks
8454 1726882423.99542: worker is 1 (out of 1 available)
8454 1726882423.99552: exiting _queue_task() for managed_node3/include_tasks
8454 1726882423.99566: done queuing things up, now waiting for results queue to drain
8454 1726882423.99567: waiting for pending results...
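The include chain recorded here (assert_profile_present.yml:3 pulling in get_profile_stat.yml for each bond profile) would correspond to a task roughly like the following. This is a sketch reconstructed from the task name and path in the log, not the actual file contents:

```yaml
# Hypothetical reconstruction of assert_profile_present.yml:3,
# inferred from the task name and path shown in the log above.
- name: Include the task 'get_profile_stat.yml'
  include_tasks: get_profile_stat.yml
```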
8454 1726882423.99766: running TaskExecutor() for managed_node3/TASK: Include the task 'get_profile_stat.yml' 8454 1726882423.99863: in run() - task 0affe814-3a2d-f59f-16b9-00000000025f 8454 1726882423.99868: variable 'ansible_search_path' from source: unknown 8454 1726882423.99871: variable 'ansible_search_path' from source: unknown 8454 1726882423.99874: calling self._execute() 8454 1726882423.99950: variable 'ansible_host' from source: host vars for 'managed_node3' 8454 1726882423.99957: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8454 1726882423.99970: variable 'omit' from source: magic vars 8454 1726882424.00515: variable 'ansible_distribution_major_version' from source: facts 8454 1726882424.00518: Evaluated conditional (ansible_distribution_major_version != '6'): True 8454 1726882424.00521: _execute() done 8454 1726882424.00523: dumping result to json 8454 1726882424.00525: done dumping result, returning 8454 1726882424.00527: done running TaskExecutor() for managed_node3/TASK: Include the task 'get_profile_stat.yml' [0affe814-3a2d-f59f-16b9-00000000025f] 8454 1726882424.00529: sending task result for task 0affe814-3a2d-f59f-16b9-00000000025f 8454 1726882424.00601: done sending task result for task 0affe814-3a2d-f59f-16b9-00000000025f 8454 1726882424.00604: WORKER PROCESS EXITING 8454 1726882424.00638: no more pending results, returning what we have 8454 1726882424.00645: in VariableManager get_vars() 8454 1726882424.00698: Calling all_inventory to load vars for managed_node3 8454 1726882424.00701: Calling groups_inventory to load vars for managed_node3 8454 1726882424.00705: Calling all_plugins_inventory to load vars for managed_node3 8454 1726882424.00720: Calling all_plugins_play to load vars for managed_node3 8454 1726882424.00724: Calling groups_plugins_inventory to load vars for managed_node3 8454 1726882424.00729: Calling groups_plugins_play to load vars for managed_node3 8454 1726882424.03944: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8454 1726882424.07372: done with get_vars() 8454 1726882424.07405: variable 'ansible_search_path' from source: unknown 8454 1726882424.07406: variable 'ansible_search_path' from source: unknown 8454 1726882424.07455: we have included files to process 8454 1726882424.07457: generating all_blocks data 8454 1726882424.07459: done generating all_blocks data 8454 1726882424.07461: processing included file: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 8454 1726882424.07462: loading included file: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 8454 1726882424.07465: Loading data from /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 8454 1726882424.08798: done processing included file 8454 1726882424.08801: iterating over new_blocks loaded from include file 8454 1726882424.08803: in VariableManager get_vars() 8454 1726882424.08828: done with get_vars() 8454 1726882424.08830: filtering new block on tags 8454 1726882424.08866: done filtering new block on tags 8454 1726882424.08869: in VariableManager get_vars() 8454 1726882424.08894: done with get_vars() 8454 1726882424.08895: filtering new block on tags 8454 1726882424.08924: done filtering new block on tags 8454 1726882424.08926: done iterating over new_blocks loaded from include file included: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml for managed_node3 8454 1726882424.08932: extending task lists for all hosts with included blocks 8454 1726882424.09166: done extending task lists 8454 1726882424.09168: done processing included files 8454 1726882424.09169: results queue empty 8454 1726882424.09170: checking for any_errors_fatal 8454 
1726882424.09174: done checking for any_errors_fatal 8454 1726882424.09175: checking for max_fail_percentage 8454 1726882424.09177: done checking for max_fail_percentage 8454 1726882424.09178: checking to see if all hosts have failed and the running result is not ok 8454 1726882424.09179: done checking to see if all hosts have failed 8454 1726882424.09180: getting the remaining hosts for this loop 8454 1726882424.09181: done getting the remaining hosts for this loop 8454 1726882424.09185: getting the next task for host managed_node3 8454 1726882424.09190: done getting next task for host managed_node3 8454 1726882424.09193: ^ task is: TASK: Initialize NM profile exist and ansible_managed comment flag 8454 1726882424.09196: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False
8454 1726882424.09198: getting variables
8454 1726882424.09200: in VariableManager get_vars()
8454 1726882424.09298: Calling all_inventory to load vars for managed_node3
8454 1726882424.09301: Calling groups_inventory to load vars for managed_node3
8454 1726882424.09304: Calling all_plugins_inventory to load vars for managed_node3
8454 1726882424.09311: Calling all_plugins_play to load vars for managed_node3
8454 1726882424.09314: Calling groups_plugins_inventory to load vars for managed_node3
8454 1726882424.09318: Calling groups_plugins_play to load vars for managed_node3
8454 1726882424.11280: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
8454 1726882424.14150: done with get_vars()
8454 1726882424.14181: done getting variables
8454 1726882424.14227: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)

TASK [Initialize NM profile exist and ansible_managed comment flag] ************
task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:3
Friday 20 September 2024 21:33:44 -0400 (0:00:00.152) 0:00:22.160 ******

8454 1726882424.14264: entering _queue_task() for managed_node3/set_fact
8454 1726882424.14621: worker is 1 (out of 1 available)
8454 1726882424.14738: exiting _queue_task() for managed_node3/set_fact
8454 1726882424.14753: done queuing things up, now waiting for results queue to drain
8454 1726882424.14755: waiting for pending results...
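The set_fact task queued here initializes three flags. The fact names and values below are taken from the task result recorded further on in this log; the exact YAML layout of the file is an assumption:

```yaml
# Sketch of get_profile_stat.yml:3; the fact names and values match the
# task result in this log, but the YAML layout itself is assumed.
- name: Initialize NM profile exist and ansible_managed comment flag
  set_fact:
    lsr_net_profile_exists: false
    lsr_net_profile_ansible_managed: false
    lsr_net_profile_fingerprint: false
```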
8454 1726882424.15153: running TaskExecutor() for managed_node3/TASK: Initialize NM profile exist and ansible_managed comment flag 8454 1726882424.15158: in run() - task 0affe814-3a2d-f59f-16b9-0000000003b0 8454 1726882424.15162: variable 'ansible_search_path' from source: unknown 8454 1726882424.15166: variable 'ansible_search_path' from source: unknown 8454 1726882424.15169: calling self._execute() 8454 1726882424.15245: variable 'ansible_host' from source: host vars for 'managed_node3' 8454 1726882424.15251: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8454 1726882424.15263: variable 'omit' from source: magic vars 8454 1726882424.15940: variable 'ansible_distribution_major_version' from source: facts 8454 1726882424.15943: Evaluated conditional (ansible_distribution_major_version != '6'): True 8454 1726882424.15947: variable 'omit' from source: magic vars 8454 1726882424.15950: variable 'omit' from source: magic vars 8454 1726882424.15952: variable 'omit' from source: magic vars 8454 1726882424.15955: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 8454 1726882424.15957: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 8454 1726882424.15960: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 8454 1726882424.15969: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8454 1726882424.15987: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8454 1726882424.16022: variable 'inventory_hostname' from source: host vars for 'managed_node3' 8454 1726882424.16026: variable 'ansible_host' from source: host vars for 'managed_node3' 8454 1726882424.16031: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8454 
1726882424.16158: Set connection var ansible_connection to ssh 8454 1726882424.16339: Set connection var ansible_shell_executable to /bin/sh 8454 1726882424.16343: Set connection var ansible_timeout to 10 8454 1726882424.16347: Set connection var ansible_shell_type to sh 8454 1726882424.16350: Set connection var ansible_pipelining to False 8454 1726882424.16352: Set connection var ansible_module_compression to ZIP_DEFLATED 8454 1726882424.16354: variable 'ansible_shell_executable' from source: unknown 8454 1726882424.16357: variable 'ansible_connection' from source: unknown 8454 1726882424.16360: variable 'ansible_module_compression' from source: unknown 8454 1726882424.16362: variable 'ansible_shell_type' from source: unknown 8454 1726882424.16365: variable 'ansible_shell_executable' from source: unknown 8454 1726882424.16367: variable 'ansible_host' from source: host vars for 'managed_node3' 8454 1726882424.16369: variable 'ansible_pipelining' from source: unknown 8454 1726882424.16372: variable 'ansible_timeout' from source: unknown 8454 1726882424.16374: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8454 1726882424.16418: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 8454 1726882424.16436: variable 'omit' from source: magic vars 8454 1726882424.16449: starting attempt loop 8454 1726882424.16452: running the handler 8454 1726882424.16467: handler run complete 8454 1726882424.16482: attempt loop complete, returning result 8454 1726882424.16485: _execute() done 8454 1726882424.16489: dumping result to json 8454 1726882424.16492: done dumping result, returning 8454 1726882424.16499: done running TaskExecutor() for managed_node3/TASK: Initialize NM profile exist and 
ansible_managed comment flag [0affe814-3a2d-f59f-16b9-0000000003b0]
8454 1726882424.16506: sending task result for task 0affe814-3a2d-f59f-16b9-0000000003b0
8454 1726882424.16604: done sending task result for task 0affe814-3a2d-f59f-16b9-0000000003b0
8454 1726882424.16607: WORKER PROCESS EXITING
ok: [managed_node3] => {
    "ansible_facts": {
        "lsr_net_profile_ansible_managed": false,
        "lsr_net_profile_exists": false,
        "lsr_net_profile_fingerprint": false
    },
    "changed": false
}
8454 1726882424.16701: no more pending results, returning what we have
8454 1726882424.16705: results queue empty
8454 1726882424.16706: checking for any_errors_fatal
8454 1726882424.16708: done checking for any_errors_fatal
8454 1726882424.16709: checking for max_fail_percentage
8454 1726882424.16711: done checking for max_fail_percentage
8454 1726882424.16713: checking to see if all hosts have failed and the running result is not ok
8454 1726882424.16714: done checking to see if all hosts have failed
8454 1726882424.16715: getting the remaining hosts for this loop
8454 1726882424.16717: done getting the remaining hosts for this loop
8454 1726882424.16721: getting the next task for host managed_node3
8454 1726882424.16730: done getting next task for host managed_node3
8454 1726882424.16733: ^ task is: TASK: Stat profile file
8454 1726882424.16740: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task?
False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
8454 1726882424.16745: getting variables
8454 1726882424.16746: in VariableManager get_vars()
8454 1726882424.16791: Calling all_inventory to load vars for managed_node3
8454 1726882424.16795: Calling groups_inventory to load vars for managed_node3
8454 1726882424.16798: Calling all_plugins_inventory to load vars for managed_node3
8454 1726882424.16811: Calling all_plugins_play to load vars for managed_node3
8454 1726882424.16815: Calling groups_plugins_inventory to load vars for managed_node3
8454 1726882424.16819: Calling groups_plugins_play to load vars for managed_node3
8454 1726882424.20760: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
8454 1726882424.26765: done with get_vars()
8454 1726882424.26808: done getting variables

TASK [Stat profile file] *******************************************************
task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:9
Friday 20 September 2024 21:33:44 -0400 (0:00:00.126) 0:00:22.286 ******

8454 1726882424.26922: entering _queue_task() for managed_node3/stat
8454 1726882424.27693: worker is 1 (out of 1 available)
8454 1726882424.27707: exiting _queue_task() for managed_node3/stat
8454 1726882424.27722: done queuing things up, now waiting for results queue to drain
8454 1726882424.27724: waiting for pending results...
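The "Stat profile file" task queued here resolves the `profile` and `item` include params and runs the stat module on the managed node. A hypothetical sketch of what get_profile_stat.yml:9 might look like; the module and task name come from the log, but the path pattern and register name are assumptions:

```yaml
# Hypothetical sketch of get_profile_stat.yml:9; path pattern and
# register name are assumed, not taken from the actual file.
- name: Stat profile file
  stat:
    path: "/etc/sysconfig/network-scripts/ifcfg-{{ profile }}"
  register: profile_stat
```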
8454 1726882424.28457: running TaskExecutor() for managed_node3/TASK: Stat profile file 8454 1726882424.28643: in run() - task 0affe814-3a2d-f59f-16b9-0000000003b1 8454 1726882424.28648: variable 'ansible_search_path' from source: unknown 8454 1726882424.28651: variable 'ansible_search_path' from source: unknown 8454 1726882424.28655: calling self._execute() 8454 1726882424.28845: variable 'ansible_host' from source: host vars for 'managed_node3' 8454 1726882424.28853: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8454 1726882424.28867: variable 'omit' from source: magic vars 8454 1726882424.29940: variable 'ansible_distribution_major_version' from source: facts 8454 1726882424.29945: Evaluated conditional (ansible_distribution_major_version != '6'): True 8454 1726882424.29947: variable 'omit' from source: magic vars 8454 1726882424.29961: variable 'omit' from source: magic vars 8454 1726882424.30261: variable 'profile' from source: include params 8454 1726882424.30266: variable 'item' from source: include params 8454 1726882424.30463: variable 'item' from source: include params 8454 1726882424.30541: variable 'omit' from source: magic vars 8454 1726882424.30651: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 8454 1726882424.30693: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 8454 1726882424.30719: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 8454 1726882424.30884: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8454 1726882424.30889: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8454 1726882424.31003: variable 'inventory_hostname' from source: host vars for 'managed_node3' 8454 1726882424.31006: variable 
'ansible_host' from source: host vars for 'managed_node3' 8454 1726882424.31009: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8454 1726882424.31173: Set connection var ansible_connection to ssh 8454 1726882424.31305: Set connection var ansible_shell_executable to /bin/sh 8454 1726882424.31313: Set connection var ansible_timeout to 10 8454 1726882424.31316: Set connection var ansible_shell_type to sh 8454 1726882424.31328: Set connection var ansible_pipelining to False 8454 1726882424.31341: Set connection var ansible_module_compression to ZIP_DEFLATED 8454 1726882424.31365: variable 'ansible_shell_executable' from source: unknown 8454 1726882424.31369: variable 'ansible_connection' from source: unknown 8454 1726882424.31372: variable 'ansible_module_compression' from source: unknown 8454 1726882424.31380: variable 'ansible_shell_type' from source: unknown 8454 1726882424.31383: variable 'ansible_shell_executable' from source: unknown 8454 1726882424.31385: variable 'ansible_host' from source: host vars for 'managed_node3' 8454 1726882424.31390: variable 'ansible_pipelining' from source: unknown 8454 1726882424.31393: variable 'ansible_timeout' from source: unknown 8454 1726882424.31740: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8454 1726882424.31801: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 8454 1726882424.31812: variable 'omit' from source: magic vars 8454 1726882424.31819: starting attempt loop 8454 1726882424.31823: running the handler 8454 1726882424.31842: _low_level_execute_command(): starting 8454 1726882424.31852: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 8454 1726882424.33406: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 
2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 8454 1726882424.33582: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8454 1726882424.33753: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8454 1726882424.35603: stdout chunk (state=3): >>>/root <<< 8454 1726882424.35709: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8454 1726882424.35822: stderr chunk (state=3): >>><<< 8454 1726882424.35826: stdout chunk (state=3): >>><<< 8454 1726882424.35874: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 8454 1726882424.35890: _low_level_execute_command(): starting 8454 1726882424.35898: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882424.358745-9238-216022247494534 `" && echo ansible-tmp-1726882424.358745-9238-216022247494534="` echo /root/.ansible/tmp/ansible-tmp-1726882424.358745-9238-216022247494534 `" ) && sleep 0' 8454 1726882424.37251: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8454 1726882424.37296: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 8454 1726882424.37319: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 8454 1726882424.37340: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8454 1726882424.37490: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8454 1726882424.39603: stdout chunk (state=3): >>>ansible-tmp-1726882424.358745-9238-216022247494534=/root/.ansible/tmp/ansible-tmp-1726882424.358745-9238-216022247494534 <<< 8454 1726882424.39752: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8454 1726882424.40141: stderr chunk (state=3): >>><<< 8454 1726882424.40145: stdout chunk (state=3): >>><<< 8454 1726882424.40148: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882424.358745-9238-216022247494534=/root/.ansible/tmp/ansible-tmp-1726882424.358745-9238-216022247494534 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying 
existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 8454 1726882424.40151: variable 'ansible_module_compression' from source: unknown 8454 1726882424.40154: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-8454ynh6eu7y/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 8454 1726882424.40540: variable 'ansible_facts' from source: unknown 8454 1726882424.40544: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882424.358745-9238-216022247494534/AnsiballZ_stat.py 8454 1726882424.40760: Sending initial data 8454 1726882424.40773: Sent initial data (150 bytes) 8454 1726882424.41952: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 8454 1726882424.42074: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 8454 1726882424.42372: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8454 1726882424.42489: stderr chunk (state=3): >>>debug1: 
mux_client_request_session: master session id: 2 <<< 8454 1726882424.44282: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 8454 1726882424.44391: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 8454 1726882424.44513: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-8454ynh6eu7y/tmpelye3rff /root/.ansible/tmp/ansible-tmp-1726882424.358745-9238-216022247494534/AnsiballZ_stat.py <<< 8454 1726882424.44517: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882424.358745-9238-216022247494534/AnsiballZ_stat.py" <<< 8454 1726882424.44626: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-8454ynh6eu7y/tmpelye3rff" to remote "/root/.ansible/tmp/ansible-tmp-1726882424.358745-9238-216022247494534/AnsiballZ_stat.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882424.358745-9238-216022247494534/AnsiballZ_stat.py" <<< 8454 1726882424.46996: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8454 1726882424.47030: stderr chunk (state=3): >>><<< 8454 1726882424.47037: stdout chunk (state=3): >>><<< 8454 1726882424.47065: done 
transferring module to remote 8454 1726882424.47081: _low_level_execute_command(): starting 8454 1726882424.47140: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882424.358745-9238-216022247494534/ /root/.ansible/tmp/ansible-tmp-1726882424.358745-9238-216022247494534/AnsiballZ_stat.py && sleep 0' 8454 1726882424.48570: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 8454 1726882424.48578: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found <<< 8454 1726882424.48654: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8454 1726882424.48661: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 8454 1726882424.48667: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 8454 1726882424.48866: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8454 1726882424.48870: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 8454 1726882424.48877: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 8454 1726882424.48948: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8454 1726882424.49152: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8454 
1726882424.51925: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8454 1726882424.52091: stderr chunk (state=3): >>><<< 8454 1726882424.52099: stdout chunk (state=3): >>><<< 8454 1726882424.52176: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 8454 1726882424.52180: _low_level_execute_command(): starting 8454 1726882424.52199: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882424.358745-9238-216022247494534/AnsiballZ_stat.py && sleep 0' 8454 1726882424.52927: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 8454 1726882424.52939: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 8454 1726882424.52954: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 8454 
1726882424.52989: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8454 1726882424.53122: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8454 1726882424.53126: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 8454 1726882424.53129: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 8454 1726882424.53196: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8454 1726882424.53471: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8454 1726882424.72914: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-bond0", "follow": false, "checksum_algorithm": "sha1"}}} <<< 8454 1726882424.74647: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.41.238 closed. 
<<< 8454 1726882424.74652: stdout chunk (state=3): >>><<< 8454 1726882424.74654: stderr chunk (state=3): >>><<< 8454 1726882424.74657: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-bond0", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.41.238 closed. 
8454 1726882424.74660: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/etc/sysconfig/network-scripts/ifcfg-bond0', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882424.358745-9238-216022247494534/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 8454 1726882424.74663: _low_level_execute_command(): starting 8454 1726882424.74666: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882424.358745-9238-216022247494534/ > /dev/null 2>&1 && sleep 0' 8454 1726882424.75838: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 8454 1726882424.75844: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 8454 1726882424.75863: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8454 1726882424.75869: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration <<< 8454 1726882424.75887: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 8454 1726882424.75895: stderr 
chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8454 1726882424.75978: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 8454 1726882424.76005: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 8454 1726882424.76020: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8454 1726882424.76170: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8454 1726882424.78340: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8454 1726882424.78357: stdout chunk (state=3): >>><<< 8454 1726882424.78366: stderr chunk (state=3): >>><<< 8454 1726882424.78385: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting 
O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 8454 1726882424.78740: handler run complete 8454 1726882424.78743: attempt loop complete, returning result 8454 1726882424.78746: _execute() done 8454 1726882424.78748: dumping result to json 8454 1726882424.78751: done dumping result, returning 8454 1726882424.78753: done running TaskExecutor() for managed_node3/TASK: Stat profile file [0affe814-3a2d-f59f-16b9-0000000003b1] 8454 1726882424.78756: sending task result for task 0affe814-3a2d-f59f-16b9-0000000003b1 ok: [managed_node3] => { "changed": false, "stat": { "exists": false } } 8454 1726882424.79010: no more pending results, returning what we have 8454 1726882424.79144: results queue empty 8454 1726882424.79146: checking for any_errors_fatal 8454 1726882424.79155: done checking for any_errors_fatal 8454 1726882424.79156: checking for max_fail_percentage 8454 1726882424.79158: done checking for max_fail_percentage 8454 1726882424.79159: checking to see if all hosts have failed and the running result is not ok 8454 1726882424.79160: done checking to see if all hosts have failed 8454 1726882424.79161: getting the remaining hosts for this loop 8454 1726882424.79163: done getting the remaining hosts for this loop 8454 1726882424.79168: getting the next task for host managed_node3 8454 1726882424.79174: done getting next task for host managed_node3 8454 1726882424.79177: ^ task is: TASK: Set NM profile exist flag based on the profile files 8454 1726882424.79182: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8454 1726882424.79186: getting variables 8454 1726882424.79187: in VariableManager get_vars() 8454 1726882424.79229: Calling all_inventory to load vars for managed_node3 8454 1726882424.79232: Calling groups_inventory to load vars for managed_node3 8454 1726882424.79340: Calling all_plugins_inventory to load vars for managed_node3 8454 1726882424.79347: done sending task result for task 0affe814-3a2d-f59f-16b9-0000000003b1 8454 1726882424.79351: WORKER PROCESS EXITING 8454 1726882424.79363: Calling all_plugins_play to load vars for managed_node3 8454 1726882424.79367: Calling groups_plugins_inventory to load vars for managed_node3 8454 1726882424.79371: Calling groups_plugins_play to load vars for managed_node3 8454 1726882424.82600: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8454 1726882424.87248: done with get_vars() 8454 1726882424.87290: done getting variables 8454 1726882424.87362: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag based on the profile files] ******************** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:17 Friday 20 September 
2024 21:33:44 -0400 (0:00:00.604) 0:00:22.891 ****** 8454 1726882424.87397: entering _queue_task() for managed_node3/set_fact 8454 1726882424.87804: worker is 1 (out of 1 available) 8454 1726882424.87815: exiting _queue_task() for managed_node3/set_fact 8454 1726882424.87830: done queuing things up, now waiting for results queue to drain 8454 1726882424.87832: waiting for pending results... 8454 1726882424.88127: running TaskExecutor() for managed_node3/TASK: Set NM profile exist flag based on the profile files 8454 1726882424.88278: in run() - task 0affe814-3a2d-f59f-16b9-0000000003b2 8454 1726882424.88299: variable 'ansible_search_path' from source: unknown 8454 1726882424.88307: variable 'ansible_search_path' from source: unknown 8454 1726882424.88355: calling self._execute() 8454 1726882424.88460: variable 'ansible_host' from source: host vars for 'managed_node3' 8454 1726882424.88478: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8454 1726882424.88498: variable 'omit' from source: magic vars 8454 1726882424.88939: variable 'ansible_distribution_major_version' from source: facts 8454 1726882424.88958: Evaluated conditional (ansible_distribution_major_version != '6'): True 8454 1726882424.89120: variable 'profile_stat' from source: set_fact 8454 1726882424.89150: Evaluated conditional (profile_stat.stat.exists): False 8454 1726882424.89159: when evaluation is False, skipping this task 8454 1726882424.89166: _execute() done 8454 1726882424.89175: dumping result to json 8454 1726882424.89184: done dumping result, returning 8454 1726882424.89195: done running TaskExecutor() for managed_node3/TASK: Set NM profile exist flag based on the profile files [0affe814-3a2d-f59f-16b9-0000000003b2] 8454 1726882424.89218: sending task result for task 0affe814-3a2d-f59f-16b9-0000000003b2 skipping: [managed_node3] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 8454 
1726882424.89398: no more pending results, returning what we have 8454 1726882424.89404: results queue empty 8454 1726882424.89405: checking for any_errors_fatal 8454 1726882424.89417: done checking for any_errors_fatal 8454 1726882424.89418: checking for max_fail_percentage 8454 1726882424.89420: done checking for max_fail_percentage 8454 1726882424.89422: checking to see if all hosts have failed and the running result is not ok 8454 1726882424.89422: done checking to see if all hosts have failed 8454 1726882424.89423: getting the remaining hosts for this loop 8454 1726882424.89425: done getting the remaining hosts for this loop 8454 1726882424.89430: getting the next task for host managed_node3 8454 1726882424.89440: done getting next task for host managed_node3 8454 1726882424.89445: ^ task is: TASK: Get NM profile info 8454 1726882424.89455: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8454 1726882424.89462: getting variables 8454 1726882424.89467: in VariableManager get_vars() 8454 1726882424.89514: Calling all_inventory to load vars for managed_node3 8454 1726882424.89518: Calling groups_inventory to load vars for managed_node3 8454 1726882424.89521: Calling all_plugins_inventory to load vars for managed_node3 8454 1726882424.89941: Calling all_plugins_play to load vars for managed_node3 8454 1726882424.89945: Calling groups_plugins_inventory to load vars for managed_node3 8454 1726882424.89951: Calling groups_plugins_play to load vars for managed_node3 8454 1726882424.90676: done sending task result for task 0affe814-3a2d-f59f-16b9-0000000003b2 8454 1726882424.90680: WORKER PROCESS EXITING 8454 1726882424.92167: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8454 1726882424.94081: done with get_vars() 8454 1726882424.94114: done getting variables 8454 1726882424.94182: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Get NM profile info] ***************************************************** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:25 Friday 20 September 2024 21:33:44 -0400 (0:00:00.068) 0:00:22.959 ****** 8454 1726882424.94213: entering _queue_task() for managed_node3/shell 8454 1726882424.94514: worker is 1 (out of 1 available) 8454 1726882424.94535: exiting _queue_task() for managed_node3/shell 8454 1726882424.94549: done queuing things up, now waiting for results queue to drain 8454 1726882424.94550: waiting for pending results... 
8454 1726882424.94868: running TaskExecutor() for managed_node3/TASK: Get NM profile info 8454 1726882424.94986: in run() - task 0affe814-3a2d-f59f-16b9-0000000003b3 8454 1726882424.95000: variable 'ansible_search_path' from source: unknown 8454 1726882424.95003: variable 'ansible_search_path' from source: unknown 8454 1726882424.95053: calling self._execute() 8454 1726882424.95138: variable 'ansible_host' from source: host vars for 'managed_node3' 8454 1726882424.95147: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8454 1726882424.95170: variable 'omit' from source: magic vars 8454 1726882424.95640: variable 'ansible_distribution_major_version' from source: facts 8454 1726882424.95644: Evaluated conditional (ansible_distribution_major_version != '6'): True 8454 1726882424.95647: variable 'omit' from source: magic vars 8454 1726882424.95697: variable 'omit' from source: magic vars 8454 1726882424.95815: variable 'profile' from source: include params 8454 1726882424.95819: variable 'item' from source: include params 8454 1726882424.95899: variable 'item' from source: include params 8454 1726882424.96020: variable 'omit' from source: magic vars 8454 1726882424.96024: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 8454 1726882424.96029: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 8454 1726882424.96031: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 8454 1726882424.96053: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8454 1726882424.96066: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8454 1726882424.96105: variable 'inventory_hostname' from source: host vars for 'managed_node3' 8454 1726882424.96109: variable 
'ansible_host' from source: host vars for 'managed_node3' 8454 1726882424.96113: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8454 1726882424.96279: Set connection var ansible_connection to ssh 8454 1726882424.96282: Set connection var ansible_shell_executable to /bin/sh 8454 1726882424.96285: Set connection var ansible_timeout to 10 8454 1726882424.96288: Set connection var ansible_shell_type to sh 8454 1726882424.96290: Set connection var ansible_pipelining to False 8454 1726882424.96292: Set connection var ansible_module_compression to ZIP_DEFLATED 8454 1726882424.96308: variable 'ansible_shell_executable' from source: unknown 8454 1726882424.96311: variable 'ansible_connection' from source: unknown 8454 1726882424.96319: variable 'ansible_module_compression' from source: unknown 8454 1726882424.96321: variable 'ansible_shell_type' from source: unknown 8454 1726882424.96324: variable 'ansible_shell_executable' from source: unknown 8454 1726882424.96329: variable 'ansible_host' from source: host vars for 'managed_node3' 8454 1726882424.96350: variable 'ansible_pipelining' from source: unknown 8454 1726882424.96354: variable 'ansible_timeout' from source: unknown 8454 1726882424.96356: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8454 1726882424.96503: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 8454 1726882424.96513: variable 'omit' from source: magic vars 8454 1726882424.96518: starting attempt loop 8454 1726882424.96522: running the handler 8454 1726882424.96532: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 8454 1726882424.96551: _low_level_execute_command(): starting 8454 1726882424.96558: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 8454 1726882424.97094: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 8454 1726882424.97097: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 8454 1726882424.97100: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8454 1726882424.97104: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 8454 1726882424.97107: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8454 1726882424.97160: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 8454 1726882424.97164: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 8454 1726882424.97290: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8454 1726882424.99142: stdout chunk (state=3): >>>/root <<< 8454 1726882424.99251: stderr chunk (state=3): >>>debug2: Received exit 
status from master 0 <<< 8454 1726882424.99297: stderr chunk (state=3): >>><<< 8454 1726882424.99301: stdout chunk (state=3): >>><<< 8454 1726882424.99324: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 8454 1726882424.99338: _low_level_execute_command(): starting 8454 1726882424.99343: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882424.9932437-9264-113045006450777 `" && echo ansible-tmp-1726882424.9932437-9264-113045006450777="` echo /root/.ansible/tmp/ansible-tmp-1726882424.9932437-9264-113045006450777 `" ) && sleep 0' 8454 1726882424.99773: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8454 1726882424.99784: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found <<< 8454 1726882424.99787: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8454 1726882424.99837: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 8454 1726882424.99841: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8454 1726882424.99966: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8454 1726882425.02081: stdout chunk (state=3): >>>ansible-tmp-1726882424.9932437-9264-113045006450777=/root/.ansible/tmp/ansible-tmp-1726882424.9932437-9264-113045006450777 <<< 8454 1726882425.02197: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8454 1726882425.02241: stderr chunk (state=3): >>><<< 8454 1726882425.02245: stdout chunk (state=3): >>><<< 8454 1726882425.02267: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882424.9932437-9264-113045006450777=/root/.ansible/tmp/ansible-tmp-1726882424.9932437-9264-113045006450777 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 8454 1726882425.02292: variable 'ansible_module_compression' from source: unknown 8454 1726882425.02331: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-8454ynh6eu7y/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 8454 1726882425.02366: variable 'ansible_facts' from source: unknown 8454 1726882425.02418: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882424.9932437-9264-113045006450777/AnsiballZ_command.py 8454 1726882425.02524: Sending initial data 8454 1726882425.02527: Sent initial data (154 bytes) 8454 1726882425.02966: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 8454 1726882425.02970: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 8454 1726882425.02972: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 8454 1726882425.02975: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8454 1726882425.03028: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 8454 1726882425.03038: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 8454 1726882425.03154: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8454 1726882425.04876: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 <<< 8454 1726882425.04880: stderr chunk (state=3): >>>debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 8454 1726882425.04987: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 8454 1726882425.05099: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-8454ynh6eu7y/tmptmszu58i /root/.ansible/tmp/ansible-tmp-1726882424.9932437-9264-113045006450777/AnsiballZ_command.py <<< 8454 1726882425.05107: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882424.9932437-9264-113045006450777/AnsiballZ_command.py" <<< 8454 1726882425.05212: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-8454ynh6eu7y/tmptmszu58i" to remote "/root/.ansible/tmp/ansible-tmp-1726882424.9932437-9264-113045006450777/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882424.9932437-9264-113045006450777/AnsiballZ_command.py" <<< 8454 1726882425.06289: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8454 1726882425.06346: stderr chunk (state=3): >>><<< 8454 1726882425.06349: stdout chunk (state=3): >>><<< 8454 1726882425.06367: done transferring module to remote 8454 1726882425.06380: _low_level_execute_command(): starting 8454 1726882425.06384: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882424.9932437-9264-113045006450777/ /root/.ansible/tmp/ansible-tmp-1726882424.9932437-9264-113045006450777/AnsiballZ_command.py && sleep 0' 8454 1726882425.06819: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 8454 1726882425.06824: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 8454 1726882425.06827: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8454 1726882425.06829: 
stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 8454 1726882425.06832: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8454 1726882425.06894: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 8454 1726882425.06897: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8454 1726882425.07006: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8454 1726882425.09001: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8454 1726882425.09048: stderr chunk (state=3): >>><<< 8454 1726882425.09051: stdout chunk (state=3): >>><<< 8454 1726882425.09064: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 8454 1726882425.09067: _low_level_execute_command(): starting 8454 1726882425.09073: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882424.9932437-9264-113045006450777/AnsiballZ_command.py && sleep 0' 8454 1726882425.09505: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 8454 1726882425.09508: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8454 1726882425.09511: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 8454 1726882425.09513: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8454 1726882425.09576: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 8454 1726882425.09581: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 8454 
1726882425.09698: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8454 1726882425.29621: stdout chunk (state=3): >>> {"changed": true, "stdout": "bond0 /etc/NetworkManager/system-connections/bond0.nmconnection \nbond0.0 /etc/NetworkManager/system-connections/bond0.0.nmconnection \nbond0.1 /etc/NetworkManager/system-connections/bond0.1.nmconnection ", "stderr": "", "rc": 0, "cmd": "nmcli -f NAME,FILENAME connection show |grep bond0 | grep /etc", "start": "2024-09-20 21:33:45.270280", "end": "2024-09-20 21:33:45.294019", "delta": "0:00:00.023739", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep bond0 | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 8454 1726882425.31575: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.41.238 closed. 
<<< 8454 1726882425.31582: stdout chunk (state=3): >>><<< 8454 1726882425.31585: stderr chunk (state=3): >>><<< 8454 1726882425.31761: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "bond0 /etc/NetworkManager/system-connections/bond0.nmconnection \nbond0.0 /etc/NetworkManager/system-connections/bond0.0.nmconnection \nbond0.1 /etc/NetworkManager/system-connections/bond0.1.nmconnection ", "stderr": "", "rc": 0, "cmd": "nmcli -f NAME,FILENAME connection show |grep bond0 | grep /etc", "start": "2024-09-20 21:33:45.270280", "end": "2024-09-20 21:33:45.294019", "delta": "0:00:00.023739", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep bond0 | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 
debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.41.238 closed. 8454 1726882425.31843: done with _execute_module (ansible.legacy.command, {'_raw_params': 'nmcli -f NAME,FILENAME connection show |grep bond0 | grep /etc', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882424.9932437-9264-113045006450777/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 8454 1726882425.31846: _low_level_execute_command(): starting 8454 1726882425.31849: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882424.9932437-9264-113045006450777/ > /dev/null 2>&1 && sleep 0' 8454 1726882425.33092: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 8454 1726882425.33096: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 8454 1726882425.33098: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 8454 1726882425.33104: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 8454 1726882425.33106: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 <<< 8454 1726882425.33108: stderr chunk (state=3): >>>debug2: match not found <<< 8454 1726882425.33111: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 8454 1726882425.33231: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master <<< 8454 1726882425.33249: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 8454 1726882425.33264: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8454 1726882425.33420: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8454 1726882425.35451: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8454 1726882425.35520: stderr chunk (state=3): >>><<< 8454 1726882425.35584: stdout chunk (state=3): >>><<< 8454 1726882425.35603: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 
'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 8454 1726882425.35611: handler run complete 8454 1726882425.35646: Evaluated conditional (False): False 8454 1726882425.35668: attempt loop complete, returning result 8454 1726882425.35671: _execute() done 8454 1726882425.35674: dumping result to json 8454 1726882425.35844: done dumping result, returning 8454 1726882425.35858: done running TaskExecutor() for managed_node3/TASK: Get NM profile info [0affe814-3a2d-f59f-16b9-0000000003b3] 8454 1726882425.35861: sending task result for task 0affe814-3a2d-f59f-16b9-0000000003b3 8454 1726882425.35980: done sending task result for task 0affe814-3a2d-f59f-16b9-0000000003b3 8454 1726882425.35983: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "cmd": "nmcli -f NAME,FILENAME connection show |grep bond0 | grep /etc", "delta": "0:00:00.023739", "end": "2024-09-20 21:33:45.294019", "rc": 0, "start": "2024-09-20 21:33:45.270280" } STDOUT: bond0 /etc/NetworkManager/system-connections/bond0.nmconnection bond0.0 /etc/NetworkManager/system-connections/bond0.0.nmconnection bond0.1 /etc/NetworkManager/system-connections/bond0.1.nmconnection 8454 1726882425.36089: no more pending results, returning what we have 8454 1726882425.36094: results queue empty 8454 1726882425.36096: checking for any_errors_fatal 8454 1726882425.36104: done checking for any_errors_fatal 8454 1726882425.36105: checking for max_fail_percentage 8454 1726882425.36108: done checking for max_fail_percentage 8454 1726882425.36109: checking to see if all hosts have failed and the running result is not ok 8454 1726882425.36110: done checking to see if all hosts have failed 8454 
1726882425.36111: getting the remaining hosts for this loop 8454 1726882425.36113: done getting the remaining hosts for this loop 8454 1726882425.36119: getting the next task for host managed_node3 8454 1726882425.36128: done getting next task for host managed_node3 8454 1726882425.36132: ^ task is: TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output 8454 1726882425.36139: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8454 1726882425.36144: getting variables 8454 1726882425.36147: in VariableManager get_vars() 8454 1726882425.36197: Calling all_inventory to load vars for managed_node3 8454 1726882425.36201: Calling groups_inventory to load vars for managed_node3 8454 1726882425.36203: Calling all_plugins_inventory to load vars for managed_node3 8454 1726882425.36216: Calling all_plugins_play to load vars for managed_node3 8454 1726882425.36219: Calling groups_plugins_inventory to load vars for managed_node3 8454 1726882425.36223: Calling groups_plugins_play to load vars for managed_node3 8454 1726882425.38994: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8454 1726882425.41974: done with get_vars() 8454 1726882425.42015: done getting variables 8454 1726882425.42088: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag and ansible_managed flag true based on the nmcli output] *** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:35 Friday 20 September 2024 21:33:45 -0400 (0:00:00.479) 0:00:23.438 ****** 8454 1726882425.42129: entering _queue_task() for managed_node3/set_fact 8454 1726882425.42569: worker is 1 (out of 1 available) 8454 1726882425.42584: exiting _queue_task() for managed_node3/set_fact 8454 1726882425.42597: done queuing things up, now waiting for results queue to drain 8454 1726882425.42599: waiting for pending results... 
8454 1726882425.42876: running TaskExecutor() for managed_node3/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output 8454 1726882425.43087: in run() - task 0affe814-3a2d-f59f-16b9-0000000003b4 8454 1726882425.43091: variable 'ansible_search_path' from source: unknown 8454 1726882425.43093: variable 'ansible_search_path' from source: unknown 8454 1726882425.43097: calling self._execute() 8454 1726882425.43167: variable 'ansible_host' from source: host vars for 'managed_node3' 8454 1726882425.43184: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8454 1726882425.43213: variable 'omit' from source: magic vars 8454 1726882425.43679: variable 'ansible_distribution_major_version' from source: facts 8454 1726882425.43701: Evaluated conditional (ansible_distribution_major_version != '6'): True 8454 1726882425.43892: variable 'nm_profile_exists' from source: set_fact 8454 1726882425.43960: Evaluated conditional (nm_profile_exists.rc == 0): True 8454 1726882425.43968: variable 'omit' from source: magic vars 8454 1726882425.44006: variable 'omit' from source: magic vars 8454 1726882425.44052: variable 'omit' from source: magic vars 8454 1726882425.44115: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 8454 1726882425.44161: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 8454 1726882425.44200: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 8454 1726882425.44286: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8454 1726882425.44289: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8454 1726882425.44293: variable 'inventory_hostname' from source: host vars for 'managed_node3' 8454 1726882425.44297: 
variable 'ansible_host' from source: host vars for 'managed_node3' 8454 1726882425.44307: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8454 1726882425.44444: Set connection var ansible_connection to ssh 8454 1726882425.44460: Set connection var ansible_shell_executable to /bin/sh 8454 1726882425.44504: Set connection var ansible_timeout to 10 8454 1726882425.44507: Set connection var ansible_shell_type to sh 8454 1726882425.44509: Set connection var ansible_pipelining to False 8454 1726882425.44512: Set connection var ansible_module_compression to ZIP_DEFLATED 8454 1726882425.44545: variable 'ansible_shell_executable' from source: unknown 8454 1726882425.44554: variable 'ansible_connection' from source: unknown 8454 1726882425.44562: variable 'ansible_module_compression' from source: unknown 8454 1726882425.44612: variable 'ansible_shell_type' from source: unknown 8454 1726882425.44615: variable 'ansible_shell_executable' from source: unknown 8454 1726882425.44617: variable 'ansible_host' from source: host vars for 'managed_node3' 8454 1726882425.44620: variable 'ansible_pipelining' from source: unknown 8454 1726882425.44626: variable 'ansible_timeout' from source: unknown 8454 1726882425.44629: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8454 1726882425.44797: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 8454 1726882425.44815: variable 'omit' from source: magic vars 8454 1726882425.44831: starting attempt loop 8454 1726882425.44843: running the handler 8454 1726882425.44937: handler run complete 8454 1726882425.44940: attempt loop complete, returning result 8454 1726882425.44945: _execute() done 8454 1726882425.44948: dumping result to json 
8454 1726882425.44950: done dumping result, returning 8454 1726882425.44953: done running TaskExecutor() for managed_node3/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output [0affe814-3a2d-f59f-16b9-0000000003b4] 8454 1726882425.44955: sending task result for task 0affe814-3a2d-f59f-16b9-0000000003b4 8454 1726882425.45026: done sending task result for task 0affe814-3a2d-f59f-16b9-0000000003b4 8454 1726882425.45029: WORKER PROCESS EXITING ok: [managed_node3] => { "ansible_facts": { "lsr_net_profile_ansible_managed": true, "lsr_net_profile_exists": true, "lsr_net_profile_fingerprint": true }, "changed": false } 8454 1726882425.45110: no more pending results, returning what we have 8454 1726882425.45114: results queue empty 8454 1726882425.45115: checking for any_errors_fatal 8454 1726882425.45123: done checking for any_errors_fatal 8454 1726882425.45124: checking for max_fail_percentage 8454 1726882425.45126: done checking for max_fail_percentage 8454 1726882425.45128: checking to see if all hosts have failed and the running result is not ok 8454 1726882425.45129: done checking to see if all hosts have failed 8454 1726882425.45130: getting the remaining hosts for this loop 8454 1726882425.45132: done getting the remaining hosts for this loop 8454 1726882425.45340: getting the next task for host managed_node3 8454 1726882425.45350: done getting next task for host managed_node3 8454 1726882425.45353: ^ task is: TASK: Get the ansible_managed comment in ifcfg-{{ profile }} 8454 1726882425.45358: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8454 1726882425.45362: getting variables 8454 1726882425.45364: in VariableManager get_vars() 8454 1726882425.45403: Calling all_inventory to load vars for managed_node3 8454 1726882425.45406: Calling groups_inventory to load vars for managed_node3 8454 1726882425.45409: Calling all_plugins_inventory to load vars for managed_node3 8454 1726882425.45420: Calling all_plugins_play to load vars for managed_node3 8454 1726882425.45423: Calling groups_plugins_inventory to load vars for managed_node3 8454 1726882425.45427: Calling groups_plugins_play to load vars for managed_node3 8454 1726882425.47742: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8454 1726882425.51411: done with get_vars() 8454 1726882425.51450: done getting variables 8454 1726882425.51520: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 8454 1726882425.51666: variable 'profile' from source: include params 8454 1726882425.51671: variable 'item' from source: include params 8454 1726882425.51751: variable 'item' from source: include params TASK [Get the ansible_managed comment in ifcfg-bond0] ************************** task path: 
/tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:49 Friday 20 September 2024 21:33:45 -0400 (0:00:00.096) 0:00:23.535 ****** 8454 1726882425.51790: entering _queue_task() for managed_node3/command 8454 1726882425.52247: worker is 1 (out of 1 available) 8454 1726882425.52260: exiting _queue_task() for managed_node3/command 8454 1726882425.52388: done queuing things up, now waiting for results queue to drain 8454 1726882425.52390: waiting for pending results... 8454 1726882425.53155: running TaskExecutor() for managed_node3/TASK: Get the ansible_managed comment in ifcfg-bond0 8454 1726882425.53253: in run() - task 0affe814-3a2d-f59f-16b9-0000000003b6 8454 1726882425.53257: variable 'ansible_search_path' from source: unknown 8454 1726882425.53259: variable 'ansible_search_path' from source: unknown 8454 1726882425.53265: calling self._execute() 8454 1726882425.53442: variable 'ansible_host' from source: host vars for 'managed_node3' 8454 1726882425.53456: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8454 1726882425.53577: variable 'omit' from source: magic vars 8454 1726882425.54382: variable 'ansible_distribution_major_version' from source: facts 8454 1726882425.54401: Evaluated conditional (ansible_distribution_major_version != '6'): True 8454 1726882425.54715: variable 'profile_stat' from source: set_fact 8454 1726882425.54759: Evaluated conditional (profile_stat.stat.exists): False 8454 1726882425.54782: when evaluation is False, skipping this task 8454 1726882425.54996: _execute() done 8454 1726882425.54999: dumping result to json 8454 1726882425.55002: done dumping result, returning 8454 1726882425.55004: done running TaskExecutor() for managed_node3/TASK: Get the ansible_managed comment in ifcfg-bond0 [0affe814-3a2d-f59f-16b9-0000000003b6] 8454 1726882425.55007: sending task result for task 0affe814-3a2d-f59f-16b9-0000000003b6 8454 1726882425.55078: done sending 
task result for task 0affe814-3a2d-f59f-16b9-0000000003b6 8454 1726882425.55082: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 8454 1726882425.55161: no more pending results, returning what we have 8454 1726882425.55167: results queue empty 8454 1726882425.55168: checking for any_errors_fatal 8454 1726882425.55175: done checking for any_errors_fatal 8454 1726882425.55176: checking for max_fail_percentage 8454 1726882425.55178: done checking for max_fail_percentage 8454 1726882425.55179: checking to see if all hosts have failed and the running result is not ok 8454 1726882425.55180: done checking to see if all hosts have failed 8454 1726882425.55181: getting the remaining hosts for this loop 8454 1726882425.55183: done getting the remaining hosts for this loop 8454 1726882425.55188: getting the next task for host managed_node3 8454 1726882425.55196: done getting next task for host managed_node3 8454 1726882425.55199: ^ task is: TASK: Verify the ansible_managed comment in ifcfg-{{ profile }} 8454 1726882425.55205: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False 8454 1726882425.55210: getting variables 8454 1726882425.55212: in VariableManager get_vars() 8454 1726882425.55260: Calling all_inventory to load vars for managed_node3 8454 1726882425.55264: Calling groups_inventory to load vars for managed_node3 8454 1726882425.55266: Calling all_plugins_inventory to load vars for managed_node3 8454 1726882425.55282: Calling all_plugins_play to load vars for managed_node3 8454 1726882425.55286: Calling groups_plugins_inventory to load vars for managed_node3 8454 1726882425.55291: Calling groups_plugins_play to load vars for managed_node3 8454 1726882425.59975: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8454 1726882425.65337: done with get_vars() 8454 1726882425.65382: done getting variables 8454 1726882425.65463: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 8454 1726882425.65602: variable 'profile' from source: include params 8454 1726882425.65608: variable 'item' from source: include params 8454 1726882425.65689: variable 'item' from source: include params TASK [Verify the ansible_managed comment in ifcfg-bond0] *********************** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:56 Friday 20 September 2024 21:33:45 -0400 (0:00:00.139) 0:00:23.674 ****** 8454 1726882425.65726: entering _queue_task() for managed_node3/set_fact 8454 1726882425.66358: worker is 1 (out of 1 available) 8454 1726882425.66367: exiting _queue_task() for managed_node3/set_fact 8454 1726882425.66381: done queuing things up, now waiting for results queue to drain 8454 1726882425.66382: waiting for pending results... 
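For readers tracing this log: the repeated records of the form `Evaluated conditional (profile_stat.stat.exists): False` followed by `when evaluation is False, skipping this task` come from tasks guarded by a `when:` clause. A hypothetical sketch of what such a task in `get_profile_stat.yml` might look like follows; the actual file contents are not shown in this log, so the module arguments, the grep pattern, the ifcfg path, and the `register:` variable name are all assumptions:

```yaml
# Hypothetical sketch -- NOT the actual contents of get_profile_stat.yml.
# The command only runs when an earlier stat task found the ifcfg file;
# here profile_stat.stat.exists is False, so the task is skipped.
- name: Get the ansible_managed comment in ifcfg-{{ profile }}
  command: grep "^# Ansible managed" /etc/sysconfig/network-scripts/ifcfg-{{ profile }}
  register: ansible_managed_comment   # assumed variable name
  when: profile_stat.stat.exists
```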
8454 1726882425.66461: running TaskExecutor() for managed_node3/TASK: Verify the ansible_managed comment in ifcfg-bond0 8454 1726882425.66723: in run() - task 0affe814-3a2d-f59f-16b9-0000000003b7 8454 1726882425.66728: variable 'ansible_search_path' from source: unknown 8454 1726882425.66731: variable 'ansible_search_path' from source: unknown 8454 1726882425.66736: calling self._execute() 8454 1726882425.66815: variable 'ansible_host' from source: host vars for 'managed_node3' 8454 1726882425.66839: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8454 1726882425.66856: variable 'omit' from source: magic vars 8454 1726882425.67289: variable 'ansible_distribution_major_version' from source: facts 8454 1726882425.67310: Evaluated conditional (ansible_distribution_major_version != '6'): True 8454 1726882425.67697: variable 'profile_stat' from source: set_fact 8454 1726882425.67702: Evaluated conditional (profile_stat.stat.exists): False 8454 1726882425.67705: when evaluation is False, skipping this task 8454 1726882425.67707: _execute() done 8454 1726882425.67710: dumping result to json 8454 1726882425.67712: done dumping result, returning 8454 1726882425.67715: done running TaskExecutor() for managed_node3/TASK: Verify the ansible_managed comment in ifcfg-bond0 [0affe814-3a2d-f59f-16b9-0000000003b7] 8454 1726882425.67717: sending task result for task 0affe814-3a2d-f59f-16b9-0000000003b7 skipping: [managed_node3] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 8454 1726882425.68020: no more pending results, returning what we have 8454 1726882425.68025: results queue empty 8454 1726882425.68027: checking for any_errors_fatal 8454 1726882425.68036: done checking for any_errors_fatal 8454 1726882425.68037: checking for max_fail_percentage 8454 1726882425.68040: done checking for max_fail_percentage 8454 1726882425.68041: checking to see if all hosts have failed and the 
running result is not ok 8454 1726882425.68042: done checking to see if all hosts have failed 8454 1726882425.68043: getting the remaining hosts for this loop 8454 1726882425.68045: done getting the remaining hosts for this loop 8454 1726882425.68051: getting the next task for host managed_node3 8454 1726882425.68060: done getting next task for host managed_node3 8454 1726882425.68063: ^ task is: TASK: Get the fingerprint comment in ifcfg-{{ profile }} 8454 1726882425.68069: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8454 1726882425.68074: getting variables 8454 1726882425.68076: in VariableManager get_vars() 8454 1726882425.68128: Calling all_inventory to load vars for managed_node3 8454 1726882425.68132: Calling groups_inventory to load vars for managed_node3 8454 1726882425.68440: Calling all_plugins_inventory to load vars for managed_node3 8454 1726882425.68458: Calling all_plugins_play to load vars for managed_node3 8454 1726882425.68462: Calling groups_plugins_inventory to load vars for managed_node3 8454 1726882425.68466: Calling groups_plugins_play to load vars for managed_node3 8454 1726882425.69352: done sending task result for task 0affe814-3a2d-f59f-16b9-0000000003b7 8454 1726882425.69355: WORKER PROCESS EXITING 8454 1726882425.73106: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8454 1726882425.78754: done with get_vars() 8454 1726882425.78792: done getting variables 8454 1726882425.79073: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 8454 1726882425.79208: variable 'profile' from source: include params 8454 1726882425.79213: variable 'item' from source: include params 8454 1726882425.79493: variable 'item' from source: include params TASK [Get the fingerprint comment in ifcfg-bond0] ****************************** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:62 Friday 20 September 2024 21:33:45 -0400 (0:00:00.137) 0:00:23.812 ****** 8454 1726882425.79530: entering _queue_task() for managed_node3/command 8454 1726882425.80031: worker is 1 (out of 1 available) 8454 1726882425.80048: exiting _queue_task() for managed_node3/command 8454 1726882425.80062: done queuing 
things up, now waiting for results queue to drain 8454 1726882425.80064: waiting for pending results... 8454 1726882425.80374: running TaskExecutor() for managed_node3/TASK: Get the fingerprint comment in ifcfg-bond0 8454 1726882425.80524: in run() - task 0affe814-3a2d-f59f-16b9-0000000003b8 8454 1726882425.80549: variable 'ansible_search_path' from source: unknown 8454 1726882425.80558: variable 'ansible_search_path' from source: unknown 8454 1726882425.80609: calling self._execute() 8454 1726882425.80728: variable 'ansible_host' from source: host vars for 'managed_node3' 8454 1726882425.80744: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8454 1726882425.80762: variable 'omit' from source: magic vars 8454 1726882425.81220: variable 'ansible_distribution_major_version' from source: facts 8454 1726882425.81246: Evaluated conditional (ansible_distribution_major_version != '6'): True 8454 1726882425.81410: variable 'profile_stat' from source: set_fact 8454 1726882425.81432: Evaluated conditional (profile_stat.stat.exists): False 8454 1726882425.81453: when evaluation is False, skipping this task 8454 1726882425.81463: _execute() done 8454 1726882425.81552: dumping result to json 8454 1726882425.81560: done dumping result, returning 8454 1726882425.81563: done running TaskExecutor() for managed_node3/TASK: Get the fingerprint comment in ifcfg-bond0 [0affe814-3a2d-f59f-16b9-0000000003b8] 8454 1726882425.81566: sending task result for task 0affe814-3a2d-f59f-16b9-0000000003b8 8454 1726882425.81639: done sending task result for task 0affe814-3a2d-f59f-16b9-0000000003b8 8454 1726882425.81642: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 8454 1726882425.81719: no more pending results, returning what we have 8454 1726882425.81725: results queue empty 8454 1726882425.81726: checking for any_errors_fatal 8454 
1726882425.81738: done checking for any_errors_fatal 8454 1726882425.81740: checking for max_fail_percentage 8454 1726882425.81742: done checking for max_fail_percentage 8454 1726882425.81743: checking to see if all hosts have failed and the running result is not ok 8454 1726882425.81744: done checking to see if all hosts have failed 8454 1726882425.81745: getting the remaining hosts for this loop 8454 1726882425.81748: done getting the remaining hosts for this loop 8454 1726882425.81753: getting the next task for host managed_node3 8454 1726882425.81761: done getting next task for host managed_node3 8454 1726882425.81765: ^ task is: TASK: Verify the fingerprint comment in ifcfg-{{ profile }} 8454 1726882425.81770: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8454 1726882425.81776: getting variables 8454 1726882425.81778: in VariableManager get_vars() 8454 1726882425.81826: Calling all_inventory to load vars for managed_node3 8454 1726882425.81830: Calling groups_inventory to load vars for managed_node3 8454 1726882425.81833: Calling all_plugins_inventory to load vars for managed_node3 8454 1726882425.81958: Calling all_plugins_play to load vars for managed_node3 8454 1726882425.81963: Calling groups_plugins_inventory to load vars for managed_node3 8454 1726882425.81968: Calling groups_plugins_play to load vars for managed_node3 8454 1726882425.84347: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8454 1726882425.89889: done with get_vars() 8454 1726882425.89938: done getting variables 8454 1726882425.90009: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 8454 1726882425.90161: variable 'profile' from source: include params 8454 1726882425.90166: variable 'item' from source: include params 8454 1726882425.90241: variable 'item' from source: include params TASK [Verify the fingerprint comment in ifcfg-bond0] *************************** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:69 Friday 20 September 2024 21:33:45 -0400 (0:00:00.107) 0:00:23.920 ****** 8454 1726882425.90284: entering _queue_task() for managed_node3/set_fact 8454 1726882425.90642: worker is 1 (out of 1 available) 8454 1726882425.90654: exiting _queue_task() for managed_node3/set_fact 8454 1726882425.90669: done queuing things up, now waiting for results queue to drain 8454 1726882425.90671: waiting for pending results... 
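The `set_fact` tasks being queued here ("Verify the ansible_managed comment...", "Verify the fingerprint comment...") follow the same guarded pattern. A speculative sketch of the shape such a verify task might take; the fact name, the condition, and the value set are assumptions, since only the task name, module, and skip reason appear in the log:

```yaml
# Hypothetical sketch of the "verify" set_fact pattern seen in the log.
# Skipped here for the same reason: profile_stat.stat.exists is False.
- name: Verify the fingerprint comment in ifcfg-{{ profile }}
  set_fact:
    lsr_net_profile_fingerprint: true   # assumed fact name and value
  when: profile_stat.stat.exists
```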
8454 1726882425.91153: running TaskExecutor() for managed_node3/TASK: Verify the fingerprint comment in ifcfg-bond0 8454 1726882425.91159: in run() - task 0affe814-3a2d-f59f-16b9-0000000003b9 8454 1726882425.91162: variable 'ansible_search_path' from source: unknown 8454 1726882425.91165: variable 'ansible_search_path' from source: unknown 8454 1726882425.91171: calling self._execute() 8454 1726882425.91290: variable 'ansible_host' from source: host vars for 'managed_node3' 8454 1726882425.91297: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8454 1726882425.91309: variable 'omit' from source: magic vars 8454 1726882425.92039: variable 'ansible_distribution_major_version' from source: facts 8454 1726882425.92066: Evaluated conditional (ansible_distribution_major_version != '6'): True 8454 1726882425.92316: variable 'profile_stat' from source: set_fact 8454 1726882425.92340: Evaluated conditional (profile_stat.stat.exists): False 8454 1726882425.92639: when evaluation is False, skipping this task 8454 1726882425.92642: _execute() done 8454 1726882425.92645: dumping result to json 8454 1726882425.92663: done dumping result, returning 8454 1726882425.92666: done running TaskExecutor() for managed_node3/TASK: Verify the fingerprint comment in ifcfg-bond0 [0affe814-3a2d-f59f-16b9-0000000003b9] 8454 1726882425.92668: sending task result for task 0affe814-3a2d-f59f-16b9-0000000003b9 8454 1726882425.92730: done sending task result for task 0affe814-3a2d-f59f-16b9-0000000003b9 8454 1726882425.92733: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 8454 1726882425.92798: no more pending results, returning what we have 8454 1726882425.92802: results queue empty 8454 1726882425.92803: checking for any_errors_fatal 8454 1726882425.92807: done checking for any_errors_fatal 8454 1726882425.92808: checking for max_fail_percentage 8454 
1726882425.92811: done checking for max_fail_percentage 8454 1726882425.92812: checking to see if all hosts have failed and the running result is not ok 8454 1726882425.92813: done checking to see if all hosts have failed 8454 1726882425.92814: getting the remaining hosts for this loop 8454 1726882425.92815: done getting the remaining hosts for this loop 8454 1726882425.92819: getting the next task for host managed_node3 8454 1726882425.92829: done getting next task for host managed_node3 8454 1726882425.93038: ^ task is: TASK: Assert that the profile is present - '{{ profile }}' 8454 1726882425.93042: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8454 1726882425.93048: getting variables 8454 1726882425.93050: in VariableManager get_vars() 8454 1726882425.93093: Calling all_inventory to load vars for managed_node3 8454 1726882425.93097: Calling groups_inventory to load vars for managed_node3 8454 1726882425.93101: Calling all_plugins_inventory to load vars for managed_node3 8454 1726882425.93112: Calling all_plugins_play to load vars for managed_node3 8454 1726882425.93116: Calling groups_plugins_inventory to load vars for managed_node3 8454 1726882425.93120: Calling groups_plugins_play to load vars for managed_node3 8454 1726882425.95528: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8454 1726882425.99716: done with get_vars() 8454 1726882425.99755: done getting variables 8454 1726882425.99823: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 8454 1726882425.99959: variable 'profile' from source: include params 8454 1726882425.99964: variable 'item' from source: include params 8454 1726882426.00036: variable 'item' from source: include params TASK [Assert that the profile is present - 'bond0'] **************************** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:5 Friday 20 September 2024 21:33:45 -0400 (0:00:00.097) 0:00:24.018 ****** 8454 1726882426.00071: entering _queue_task() for managed_node3/assert 8454 1726882426.00558: worker is 1 (out of 1 available) 8454 1726882426.00571: exiting _queue_task() for managed_node3/assert 8454 1726882426.00585: done queuing things up, now waiting for results queue to drain 8454 1726882426.00587: waiting for pending results... 
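The assert task that runs next is better evidenced by the log: the records below show the task name `Assert that the profile is present - 'bond0'` at `assert_profile_present.yml:5`, the conditional `(lsr_net_profile_exists): True`, and the result `All assertions passed`. A reconstruction consistent with those records, though the real file may include additional assertions or a `fail_msg`:

```yaml
# Reconstruction based on the log records below; the actual
# assert_profile_present.yml may differ in detail.
- name: Assert that the profile is present - '{{ profile }}'
  assert:
    that:
      - lsr_net_profile_exists   # set by the earlier get_profile_stat tasks
```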
8454 1726882426.01087: running TaskExecutor() for managed_node3/TASK: Assert that the profile is present - 'bond0' 8454 1726882426.01370: in run() - task 0affe814-3a2d-f59f-16b9-000000000260 8454 1726882426.01392: variable 'ansible_search_path' from source: unknown 8454 1726882426.01396: variable 'ansible_search_path' from source: unknown 8454 1726882426.01437: calling self._execute() 8454 1726882426.01658: variable 'ansible_host' from source: host vars for 'managed_node3' 8454 1726882426.01666: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8454 1726882426.01685: variable 'omit' from source: magic vars 8454 1726882426.02591: variable 'ansible_distribution_major_version' from source: facts 8454 1726882426.02595: Evaluated conditional (ansible_distribution_major_version != '6'): True 8454 1726882426.02597: variable 'omit' from source: magic vars 8454 1726882426.02628: variable 'omit' from source: magic vars 8454 1726882426.02760: variable 'profile' from source: include params 8454 1726882426.02772: variable 'item' from source: include params 8454 1726882426.02860: variable 'item' from source: include params 8454 1726882426.02888: variable 'omit' from source: magic vars 8454 1726882426.02948: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 8454 1726882426.03025: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 8454 1726882426.03028: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 8454 1726882426.03053: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8454 1726882426.03070: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8454 1726882426.03110: variable 'inventory_hostname' from source: host vars for 'managed_node3' 8454 
1726882426.03119: variable 'ansible_host' from source: host vars for 'managed_node3' 8454 1726882426.03133: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8454 1726882426.03262: Set connection var ansible_connection to ssh 8454 1726882426.03351: Set connection var ansible_shell_executable to /bin/sh 8454 1726882426.03354: Set connection var ansible_timeout to 10 8454 1726882426.03356: Set connection var ansible_shell_type to sh 8454 1726882426.03358: Set connection var ansible_pipelining to False 8454 1726882426.03360: Set connection var ansible_module_compression to ZIP_DEFLATED 8454 1726882426.03362: variable 'ansible_shell_executable' from source: unknown 8454 1726882426.03364: variable 'ansible_connection' from source: unknown 8454 1726882426.03366: variable 'ansible_module_compression' from source: unknown 8454 1726882426.03368: variable 'ansible_shell_type' from source: unknown 8454 1726882426.03370: variable 'ansible_shell_executable' from source: unknown 8454 1726882426.03378: variable 'ansible_host' from source: host vars for 'managed_node3' 8454 1726882426.03387: variable 'ansible_pipelining' from source: unknown 8454 1726882426.03395: variable 'ansible_timeout' from source: unknown 8454 1726882426.03404: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8454 1726882426.03581: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 8454 1726882426.03603: variable 'omit' from source: magic vars 8454 1726882426.03646: starting attempt loop 8454 1726882426.03655: running the handler 8454 1726882426.03800: variable 'lsr_net_profile_exists' from source: set_fact 8454 1726882426.03942: Evaluated conditional (lsr_net_profile_exists): True 8454 1726882426.03946: 
handler run complete 8454 1726882426.03949: attempt loop complete, returning result 8454 1726882426.03951: _execute() done 8454 1726882426.03953: dumping result to json 8454 1726882426.03956: done dumping result, returning 8454 1726882426.03958: done running TaskExecutor() for managed_node3/TASK: Assert that the profile is present - 'bond0' [0affe814-3a2d-f59f-16b9-000000000260] 8454 1726882426.03961: sending task result for task 0affe814-3a2d-f59f-16b9-000000000260 8454 1726882426.04039: done sending task result for task 0affe814-3a2d-f59f-16b9-000000000260 8454 1726882426.04043: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false } MSG: All assertions passed 8454 1726882426.04103: no more pending results, returning what we have 8454 1726882426.04107: results queue empty 8454 1726882426.04108: checking for any_errors_fatal 8454 1726882426.04114: done checking for any_errors_fatal 8454 1726882426.04116: checking for max_fail_percentage 8454 1726882426.04118: done checking for max_fail_percentage 8454 1726882426.04120: checking to see if all hosts have failed and the running result is not ok 8454 1726882426.04121: done checking to see if all hosts have failed 8454 1726882426.04122: getting the remaining hosts for this loop 8454 1726882426.04123: done getting the remaining hosts for this loop 8454 1726882426.04128: getting the next task for host managed_node3 8454 1726882426.04137: done getting next task for host managed_node3 8454 1726882426.04140: ^ task is: TASK: Assert that the ansible managed comment is present in '{{ profile }}' 8454 1726882426.04145: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8454 1726882426.04148: getting variables 8454 1726882426.04150: in VariableManager get_vars() 8454 1726882426.04199: Calling all_inventory to load vars for managed_node3 8454 1726882426.04203: Calling groups_inventory to load vars for managed_node3 8454 1726882426.04206: Calling all_plugins_inventory to load vars for managed_node3 8454 1726882426.04220: Calling all_plugins_play to load vars for managed_node3 8454 1726882426.04224: Calling groups_plugins_inventory to load vars for managed_node3 8454 1726882426.04228: Calling groups_plugins_play to load vars for managed_node3 8454 1726882426.12358: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8454 1726882426.15199: done with get_vars() 8454 1726882426.15247: done getting variables 8454 1726882426.15310: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 8454 1726882426.15442: variable 'profile' from source: include params 8454 1726882426.15446: variable 'item' from source: include params 8454 1726882426.15520: variable 'item' from source: include params TASK [Assert that the ansible managed comment is present in 'bond0'] *********** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:10 Friday 20 September 2024 21:33:46 -0400 (0:00:00.154) 0:00:24.173 ****** 8454 1726882426.15563: entering _queue_task() for managed_node3/assert 8454 1726882426.15929: worker is 1 (out of 1 available) 8454 1726882426.15951: exiting _queue_task() for 
managed_node3/assert 8454 1726882426.15967: done queuing things up, now waiting for results queue to drain 8454 1726882426.15968: waiting for pending results... 8454 1726882426.16254: running TaskExecutor() for managed_node3/TASK: Assert that the ansible managed comment is present in 'bond0' 8454 1726882426.16348: in run() - task 0affe814-3a2d-f59f-16b9-000000000261 8454 1726882426.16365: variable 'ansible_search_path' from source: unknown 8454 1726882426.16369: variable 'ansible_search_path' from source: unknown 8454 1726882426.16405: calling self._execute() 8454 1726882426.16486: variable 'ansible_host' from source: host vars for 'managed_node3' 8454 1726882426.16496: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8454 1726882426.16504: variable 'omit' from source: magic vars 8454 1726882426.16819: variable 'ansible_distribution_major_version' from source: facts 8454 1726882426.16831: Evaluated conditional (ansible_distribution_major_version != '6'): True 8454 1726882426.16840: variable 'omit' from source: magic vars 8454 1726882426.16873: variable 'omit' from source: magic vars 8454 1726882426.16958: variable 'profile' from source: include params 8454 1726882426.16963: variable 'item' from source: include params 8454 1726882426.17016: variable 'item' from source: include params 8454 1726882426.17033: variable 'omit' from source: magic vars 8454 1726882426.17074: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 8454 1726882426.17106: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 8454 1726882426.17123: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 8454 1726882426.17141: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8454 1726882426.17154: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8454 1726882426.17184: variable 'inventory_hostname' from source: host vars for 'managed_node3' 8454 1726882426.17189: variable 'ansible_host' from source: host vars for 'managed_node3' 8454 1726882426.17192: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8454 1726882426.17275: Set connection var ansible_connection to ssh 8454 1726882426.17285: Set connection var ansible_shell_executable to /bin/sh 8454 1726882426.17293: Set connection var ansible_timeout to 10 8454 1726882426.17297: Set connection var ansible_shell_type to sh 8454 1726882426.17306: Set connection var ansible_pipelining to False 8454 1726882426.17312: Set connection var ansible_module_compression to ZIP_DEFLATED 8454 1726882426.17331: variable 'ansible_shell_executable' from source: unknown 8454 1726882426.17336: variable 'ansible_connection' from source: unknown 8454 1726882426.17339: variable 'ansible_module_compression' from source: unknown 8454 1726882426.17344: variable 'ansible_shell_type' from source: unknown 8454 1726882426.17347: variable 'ansible_shell_executable' from source: unknown 8454 1726882426.17350: variable 'ansible_host' from source: host vars for 'managed_node3' 8454 1726882426.17356: variable 'ansible_pipelining' from source: unknown 8454 1726882426.17359: variable 'ansible_timeout' from source: unknown 8454 1726882426.17368: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8454 1726882426.17485: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 8454 1726882426.17494: variable 'omit' from source: magic vars 8454 1726882426.17499: starting attempt loop 8454 
1726882426.17503: running the handler 8454 1726882426.17594: variable 'lsr_net_profile_ansible_managed' from source: set_fact 8454 1726882426.17599: Evaluated conditional (lsr_net_profile_ansible_managed): True 8454 1726882426.17606: handler run complete 8454 1726882426.17620: attempt loop complete, returning result 8454 1726882426.17623: _execute() done 8454 1726882426.17627: dumping result to json 8454 1726882426.17630: done dumping result, returning 8454 1726882426.17641: done running TaskExecutor() for managed_node3/TASK: Assert that the ansible managed comment is present in 'bond0' [0affe814-3a2d-f59f-16b9-000000000261] 8454 1726882426.17647: sending task result for task 0affe814-3a2d-f59f-16b9-000000000261 ok: [managed_node3] => { "changed": false } MSG: All assertions passed 8454 1726882426.17793: no more pending results, returning what we have 8454 1726882426.17797: results queue empty 8454 1726882426.17798: checking for any_errors_fatal 8454 1726882426.17805: done checking for any_errors_fatal 8454 1726882426.17806: checking for max_fail_percentage 8454 1726882426.17808: done checking for max_fail_percentage 8454 1726882426.17809: checking to see if all hosts have failed and the running result is not ok 8454 1726882426.17810: done checking to see if all hosts have failed 8454 1726882426.17811: getting the remaining hosts for this loop 8454 1726882426.17813: done getting the remaining hosts for this loop 8454 1726882426.17818: getting the next task for host managed_node3 8454 1726882426.17824: done getting next task for host managed_node3 8454 1726882426.17827: ^ task is: TASK: Assert that the fingerprint comment is present in {{ profile }} 8454 1726882426.17830: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8454 1726882426.17835: getting variables 8454 1726882426.17837: in VariableManager get_vars() 8454 1726882426.17880: Calling all_inventory to load vars for managed_node3 8454 1726882426.17884: Calling groups_inventory to load vars for managed_node3 8454 1726882426.17887: Calling all_plugins_inventory to load vars for managed_node3 8454 1726882426.17897: Calling all_plugins_play to load vars for managed_node3 8454 1726882426.17900: Calling groups_plugins_inventory to load vars for managed_node3 8454 1726882426.17910: done sending task result for task 0affe814-3a2d-f59f-16b9-000000000261 8454 1726882426.17913: WORKER PROCESS EXITING 8454 1726882426.17947: Calling groups_plugins_play to load vars for managed_node3 8454 1726882426.19630: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8454 1726882426.21210: done with get_vars() 8454 1726882426.21232: done getting variables 8454 1726882426.21309: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 8454 1726882426.21427: variable 'profile' from source: include params 8454 1726882426.21432: variable 'item' from source: include params 8454 1726882426.21504: variable 'item' from source: include params TASK [Assert that the fingerprint comment is present in bond0] ***************** task path: 
/tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:15 Friday 20 September 2024 21:33:46 -0400 (0:00:00.059) 0:00:24.233 ****** 8454 1726882426.21544: entering _queue_task() for managed_node3/assert 8454 1726882426.21822: worker is 1 (out of 1 available) 8454 1726882426.22038: exiting _queue_task() for managed_node3/assert 8454 1726882426.22052: done queuing things up, now waiting for results queue to drain 8454 1726882426.22054: waiting for pending results... 8454 1726882426.22159: running TaskExecutor() for managed_node3/TASK: Assert that the fingerprint comment is present in bond0 8454 1726882426.22271: in run() - task 0affe814-3a2d-f59f-16b9-000000000262 8454 1726882426.22294: variable 'ansible_search_path' from source: unknown 8454 1726882426.22298: variable 'ansible_search_path' from source: unknown 8454 1726882426.22339: calling self._execute() 8454 1726882426.22441: variable 'ansible_host' from source: host vars for 'managed_node3' 8454 1726882426.22448: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8454 1726882426.22461: variable 'omit' from source: magic vars 8454 1726882426.22908: variable 'ansible_distribution_major_version' from source: facts 8454 1726882426.22921: Evaluated conditional (ansible_distribution_major_version != '6'): True 8454 1726882426.22928: variable 'omit' from source: magic vars 8454 1726882426.22994: variable 'omit' from source: magic vars 8454 1726882426.23116: variable 'profile' from source: include params 8454 1726882426.23120: variable 'item' from source: include params 8454 1726882426.23305: variable 'item' from source: include params 8454 1726882426.23309: variable 'omit' from source: magic vars 8454 1726882426.23311: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 8454 1726882426.23315: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py 
(found_in_cache=True, class_only=False) 8454 1726882426.23342: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 8454 1726882426.23364: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8454 1726882426.23386: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8454 1726882426.23422: variable 'inventory_hostname' from source: host vars for 'managed_node3' 8454 1726882426.23433: variable 'ansible_host' from source: host vars for 'managed_node3' 8454 1726882426.23438: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8454 1726882426.23563: Set connection var ansible_connection to ssh 8454 1726882426.23640: Set connection var ansible_shell_executable to /bin/sh 8454 1726882426.23644: Set connection var ansible_timeout to 10 8454 1726882426.23654: Set connection var ansible_shell_type to sh 8454 1726882426.23657: Set connection var ansible_pipelining to False 8454 1726882426.23660: Set connection var ansible_module_compression to ZIP_DEFLATED 8454 1726882426.23662: variable 'ansible_shell_executable' from source: unknown 8454 1726882426.23664: variable 'ansible_connection' from source: unknown 8454 1726882426.23666: variable 'ansible_module_compression' from source: unknown 8454 1726882426.23668: variable 'ansible_shell_type' from source: unknown 8454 1726882426.23671: variable 'ansible_shell_executable' from source: unknown 8454 1726882426.23673: variable 'ansible_host' from source: host vars for 'managed_node3' 8454 1726882426.23675: variable 'ansible_pipelining' from source: unknown 8454 1726882426.23677: variable 'ansible_timeout' from source: unknown 8454 1726882426.23680: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8454 1726882426.23839: Loading ActionModule 'assert' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 8454 1726882426.23851: variable 'omit' from source: magic vars 8454 1726882426.23857: starting attempt loop 8454 1726882426.23860: running the handler 8454 1726882426.24089: variable 'lsr_net_profile_fingerprint' from source: set_fact 8454 1726882426.24093: Evaluated conditional (lsr_net_profile_fingerprint): True 8454 1726882426.24096: handler run complete 8454 1726882426.24098: attempt loop complete, returning result 8454 1726882426.24100: _execute() done 8454 1726882426.24102: dumping result to json 8454 1726882426.24104: done dumping result, returning 8454 1726882426.24106: done running TaskExecutor() for managed_node3/TASK: Assert that the fingerprint comment is present in bond0 [0affe814-3a2d-f59f-16b9-000000000262] 8454 1726882426.24108: sending task result for task 0affe814-3a2d-f59f-16b9-000000000262 8454 1726882426.24182: done sending task result for task 0affe814-3a2d-f59f-16b9-000000000262 8454 1726882426.24186: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false } MSG: All assertions passed 8454 1726882426.24241: no more pending results, returning what we have 8454 1726882426.24245: results queue empty 8454 1726882426.24246: checking for any_errors_fatal 8454 1726882426.24253: done checking for any_errors_fatal 8454 1726882426.24254: checking for max_fail_percentage 8454 1726882426.24256: done checking for max_fail_percentage 8454 1726882426.24258: checking to see if all hosts have failed and the running result is not ok 8454 1726882426.24259: done checking to see if all hosts have failed 8454 1726882426.24260: getting the remaining hosts for this loop 8454 1726882426.24262: done getting the remaining hosts for this loop 8454 1726882426.24266: getting the next task for host 
managed_node3 8454 1726882426.24280: done getting next task for host managed_node3 8454 1726882426.24283: ^ task is: TASK: Include the task 'get_profile_stat.yml' 8454 1726882426.24287: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8454 1726882426.24291: getting variables 8454 1726882426.24293: in VariableManager get_vars() 8454 1726882426.24339: Calling all_inventory to load vars for managed_node3 8454 1726882426.24343: Calling groups_inventory to load vars for managed_node3 8454 1726882426.24346: Calling all_plugins_inventory to load vars for managed_node3 8454 1726882426.24360: Calling all_plugins_play to load vars for managed_node3 8454 1726882426.24364: Calling groups_plugins_inventory to load vars for managed_node3 8454 1726882426.24368: Calling groups_plugins_play to load vars for managed_node3 8454 1726882426.26795: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8454 1726882426.29667: done with get_vars() 8454 1726882426.29704: done getting variables TASK [Include the task 'get_profile_stat.yml'] ********************************* task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:3 Friday 20 September 2024 21:33:46 -0400 (0:00:00.082) 0:00:24.315 ****** 8454 1726882426.29816: entering _queue_task() for managed_node3/include_tasks 8454 1726882426.30133: worker is 1 (out of 1 available) 8454 
1726882426.30148: exiting _queue_task() for managed_node3/include_tasks 8454 1726882426.30163: done queuing things up, now waiting for results queue to drain 8454 1726882426.30165: waiting for pending results... 8454 1726882426.30510: running TaskExecutor() for managed_node3/TASK: Include the task 'get_profile_stat.yml' 8454 1726882426.30601: in run() - task 0affe814-3a2d-f59f-16b9-000000000266 8454 1726882426.30644: variable 'ansible_search_path' from source: unknown 8454 1726882426.30647: variable 'ansible_search_path' from source: unknown 8454 1726882426.30659: calling self._execute() 8454 1726882426.30771: variable 'ansible_host' from source: host vars for 'managed_node3' 8454 1726882426.30818: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8454 1726882426.30826: variable 'omit' from source: magic vars 8454 1726882426.31237: variable 'ansible_distribution_major_version' from source: facts 8454 1726882426.31251: Evaluated conditional (ansible_distribution_major_version != '6'): True 8454 1726882426.31258: _execute() done 8454 1726882426.31263: dumping result to json 8454 1726882426.31267: done dumping result, returning 8454 1726882426.31339: done running TaskExecutor() for managed_node3/TASK: Include the task 'get_profile_stat.yml' [0affe814-3a2d-f59f-16b9-000000000266] 8454 1726882426.31343: sending task result for task 0affe814-3a2d-f59f-16b9-000000000266 8454 1726882426.31453: no more pending results, returning what we have 8454 1726882426.31460: in VariableManager get_vars() 8454 1726882426.31516: Calling all_inventory to load vars for managed_node3 8454 1726882426.31520: Calling groups_inventory to load vars for managed_node3 8454 1726882426.31523: Calling all_plugins_inventory to load vars for managed_node3 8454 1726882426.31543: Calling all_plugins_play to load vars for managed_node3 8454 1726882426.31547: Calling groups_plugins_inventory to load vars for managed_node3 8454 1726882426.31552: Calling groups_plugins_play to 
load vars for managed_node3 8454 1726882426.32072: done sending task result for task 0affe814-3a2d-f59f-16b9-000000000266 8454 1726882426.32077: WORKER PROCESS EXITING 8454 1726882426.34070: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8454 1726882426.37142: done with get_vars() 8454 1726882426.37175: variable 'ansible_search_path' from source: unknown 8454 1726882426.37177: variable 'ansible_search_path' from source: unknown 8454 1726882426.37221: we have included files to process 8454 1726882426.37222: generating all_blocks data 8454 1726882426.37225: done generating all_blocks data 8454 1726882426.37230: processing included file: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 8454 1726882426.37232: loading included file: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 8454 1726882426.37400: Loading data from /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 8454 1726882426.38696: done processing included file 8454 1726882426.38699: iterating over new_blocks loaded from include file 8454 1726882426.38701: in VariableManager get_vars() 8454 1726882426.38737: done with get_vars() 8454 1726882426.38739: filtering new block on tags 8454 1726882426.38774: done filtering new block on tags 8454 1726882426.38778: in VariableManager get_vars() 8454 1726882426.38805: done with get_vars() 8454 1726882426.38808: filtering new block on tags 8454 1726882426.38850: done filtering new block on tags 8454 1726882426.38854: done iterating over new_blocks loaded from include file included: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml for managed_node3 8454 1726882426.38860: extending task lists for all hosts with included blocks 8454 
1726882426.39123: done extending task lists 8454 1726882426.39125: done processing included files 8454 1726882426.39126: results queue empty 8454 1726882426.39127: checking for any_errors_fatal 8454 1726882426.39131: done checking for any_errors_fatal 8454 1726882426.39132: checking for max_fail_percentage 8454 1726882426.39133: done checking for max_fail_percentage 8454 1726882426.39139: checking to see if all hosts have failed and the running result is not ok 8454 1726882426.39141: done checking to see if all hosts have failed 8454 1726882426.39142: getting the remaining hosts for this loop 8454 1726882426.39143: done getting the remaining hosts for this loop 8454 1726882426.39151: getting the next task for host managed_node3 8454 1726882426.39157: done getting next task for host managed_node3 8454 1726882426.39160: ^ task is: TASK: Initialize NM profile exist and ansible_managed comment flag 8454 1726882426.39164: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8454 1726882426.39167: getting variables 8454 1726882426.39168: in VariableManager get_vars() 8454 1726882426.39184: Calling all_inventory to load vars for managed_node3 8454 1726882426.39188: Calling groups_inventory to load vars for managed_node3 8454 1726882426.39191: Calling all_plugins_inventory to load vars for managed_node3 8454 1726882426.39197: Calling all_plugins_play to load vars for managed_node3 8454 1726882426.39200: Calling groups_plugins_inventory to load vars for managed_node3 8454 1726882426.39204: Calling groups_plugins_play to load vars for managed_node3 8454 1726882426.41674: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8454 1726882426.44510: done with get_vars() 8454 1726882426.44532: done getting variables 8454 1726882426.44567: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Initialize NM profile exist and ansible_managed comment flag] ************ task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:3 Friday 20 September 2024 21:33:46 -0400 (0:00:00.147) 0:00:24.463 ****** 8454 1726882426.44594: entering _queue_task() for managed_node3/set_fact 8454 1726882426.44901: worker is 1 (out of 1 available) 8454 1726882426.44912: exiting _queue_task() for managed_node3/set_fact 8454 1726882426.44925: done queuing things up, now waiting for results queue to drain 8454 1726882426.44927: waiting for pending results... 
8454 1726882426.45188: running TaskExecutor() for managed_node3/TASK: Initialize NM profile exist and ansible_managed comment flag 8454 1726882426.45349: in run() - task 0affe814-3a2d-f59f-16b9-0000000003f8 8454 1726882426.45353: variable 'ansible_search_path' from source: unknown 8454 1726882426.45357: variable 'ansible_search_path' from source: unknown 8454 1726882426.45361: calling self._execute() 8454 1726882426.45450: variable 'ansible_host' from source: host vars for 'managed_node3' 8454 1726882426.45460: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8454 1726882426.45470: variable 'omit' from source: magic vars 8454 1726882426.45909: variable 'ansible_distribution_major_version' from source: facts 8454 1726882426.45924: Evaluated conditional (ansible_distribution_major_version != '6'): True 8454 1726882426.45930: variable 'omit' from source: magic vars 8454 1726882426.46006: variable 'omit' from source: magic vars 8454 1726882426.46047: variable 'omit' from source: magic vars 8454 1726882426.46116: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 8454 1726882426.46142: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 8454 1726882426.46167: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 8454 1726882426.46187: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8454 1726882426.46203: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8454 1726882426.46244: variable 'inventory_hostname' from source: host vars for 'managed_node3' 8454 1726882426.46249: variable 'ansible_host' from source: host vars for 'managed_node3' 8454 1726882426.46252: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8454 
1726882426.46356: Set connection var ansible_connection to ssh 8454 1726882426.46376: Set connection var ansible_shell_executable to /bin/sh 8454 1726882426.46379: Set connection var ansible_timeout to 10 8454 1726882426.46487: Set connection var ansible_shell_type to sh 8454 1726882426.46491: Set connection var ansible_pipelining to False 8454 1726882426.46493: Set connection var ansible_module_compression to ZIP_DEFLATED 8454 1726882426.46496: variable 'ansible_shell_executable' from source: unknown 8454 1726882426.46498: variable 'ansible_connection' from source: unknown 8454 1726882426.46501: variable 'ansible_module_compression' from source: unknown 8454 1726882426.46503: variable 'ansible_shell_type' from source: unknown 8454 1726882426.46505: variable 'ansible_shell_executable' from source: unknown 8454 1726882426.46508: variable 'ansible_host' from source: host vars for 'managed_node3' 8454 1726882426.46510: variable 'ansible_pipelining' from source: unknown 8454 1726882426.46513: variable 'ansible_timeout' from source: unknown 8454 1726882426.46515: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8454 1726882426.46693: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 8454 1726882426.46697: variable 'omit' from source: magic vars 8454 1726882426.46700: starting attempt loop 8454 1726882426.46702: running the handler 8454 1726882426.46705: handler run complete 8454 1726882426.46722: attempt loop complete, returning result 8454 1726882426.46726: _execute() done 8454 1726882426.46728: dumping result to json 8454 1726882426.46731: done dumping result, returning 8454 1726882426.46736: done running TaskExecutor() for managed_node3/TASK: Initialize NM profile exist and 
ansible_managed comment flag [0affe814-3a2d-f59f-16b9-0000000003f8] 8454 1726882426.46738: sending task result for task 0affe814-3a2d-f59f-16b9-0000000003f8 8454 1726882426.46997: done sending task result for task 0affe814-3a2d-f59f-16b9-0000000003f8 8454 1726882426.47000: WORKER PROCESS EXITING ok: [managed_node3] => { "ansible_facts": { "lsr_net_profile_ansible_managed": false, "lsr_net_profile_exists": false, "lsr_net_profile_fingerprint": false }, "changed": false } 8454 1726882426.47064: no more pending results, returning what we have 8454 1726882426.47068: results queue empty 8454 1726882426.47069: checking for any_errors_fatal 8454 1726882426.47070: done checking for any_errors_fatal 8454 1726882426.47071: checking for max_fail_percentage 8454 1726882426.47073: done checking for max_fail_percentage 8454 1726882426.47074: checking to see if all hosts have failed and the running result is not ok 8454 1726882426.47075: done checking to see if all hosts have failed 8454 1726882426.47076: getting the remaining hosts for this loop 8454 1726882426.47077: done getting the remaining hosts for this loop 8454 1726882426.47081: getting the next task for host managed_node3 8454 1726882426.47088: done getting next task for host managed_node3 8454 1726882426.47090: ^ task is: TASK: Stat profile file 8454 1726882426.47094: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8454 1726882426.47098: getting variables 8454 1726882426.47100: in VariableManager get_vars() 8454 1726882426.47149: Calling all_inventory to load vars for managed_node3 8454 1726882426.47152: Calling groups_inventory to load vars for managed_node3 8454 1726882426.47155: Calling all_plugins_inventory to load vars for managed_node3 8454 1726882426.47166: Calling all_plugins_play to load vars for managed_node3 8454 1726882426.47170: Calling groups_plugins_inventory to load vars for managed_node3 8454 1726882426.47174: Calling groups_plugins_play to load vars for managed_node3 8454 1726882426.48674: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8454 1726882426.50661: done with get_vars() 8454 1726882426.50697: done getting variables TASK [Stat profile file] ******************************************************* task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:9 Friday 20 September 2024 21:33:46 -0400 (0:00:00.061) 0:00:24.525 ****** 8454 1726882426.50793: entering _queue_task() for managed_node3/stat 8454 1726882426.51020: worker is 1 (out of 1 available) 8454 1726882426.51035: exiting _queue_task() for managed_node3/stat 8454 1726882426.51050: done queuing things up, now waiting for results queue to drain 8454 1726882426.51051: waiting for pending results... 
8454 1726882426.51231: running TaskExecutor() for managed_node3/TASK: Stat profile file 8454 1726882426.51323: in run() - task 0affe814-3a2d-f59f-16b9-0000000003f9 8454 1726882426.51337: variable 'ansible_search_path' from source: unknown 8454 1726882426.51341: variable 'ansible_search_path' from source: unknown 8454 1726882426.51373: calling self._execute() 8454 1726882426.51450: variable 'ansible_host' from source: host vars for 'managed_node3' 8454 1726882426.51456: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8454 1726882426.51466: variable 'omit' from source: magic vars 8454 1726882426.51791: variable 'ansible_distribution_major_version' from source: facts 8454 1726882426.51802: Evaluated conditional (ansible_distribution_major_version != '6'): True 8454 1726882426.51808: variable 'omit' from source: magic vars 8454 1726882426.51851: variable 'omit' from source: magic vars 8454 1726882426.51929: variable 'profile' from source: include params 8454 1726882426.51937: variable 'item' from source: include params 8454 1726882426.51996: variable 'item' from source: include params 8454 1726882426.52011: variable 'omit' from source: magic vars 8454 1726882426.52052: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 8454 1726882426.52087: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 8454 1726882426.52104: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 8454 1726882426.52119: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8454 1726882426.52130: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8454 1726882426.52162: variable 'inventory_hostname' from source: host vars for 'managed_node3' 8454 1726882426.52167: variable 
'ansible_host' from source: host vars for 'managed_node3' 8454 1726882426.52170: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8454 1726882426.52254: Set connection var ansible_connection to ssh 8454 1726882426.52263: Set connection var ansible_shell_executable to /bin/sh 8454 1726882426.52272: Set connection var ansible_timeout to 10 8454 1726882426.52276: Set connection var ansible_shell_type to sh 8454 1726882426.52289: Set connection var ansible_pipelining to False 8454 1726882426.52295: Set connection var ansible_module_compression to ZIP_DEFLATED 8454 1726882426.52314: variable 'ansible_shell_executable' from source: unknown 8454 1726882426.52317: variable 'ansible_connection' from source: unknown 8454 1726882426.52320: variable 'ansible_module_compression' from source: unknown 8454 1726882426.52324: variable 'ansible_shell_type' from source: unknown 8454 1726882426.52327: variable 'ansible_shell_executable' from source: unknown 8454 1726882426.52332: variable 'ansible_host' from source: host vars for 'managed_node3' 8454 1726882426.52338: variable 'ansible_pipelining' from source: unknown 8454 1726882426.52341: variable 'ansible_timeout' from source: unknown 8454 1726882426.52347: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8454 1726882426.52516: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 8454 1726882426.52526: variable 'omit' from source: magic vars 8454 1726882426.52532: starting attempt loop 8454 1726882426.52536: running the handler 8454 1726882426.52549: _low_level_execute_command(): starting 8454 1726882426.52557: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 8454 1726882426.53076: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 
2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 8454 1726882426.53083: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration <<< 8454 1726882426.53087: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8454 1726882426.53126: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 8454 1726882426.53137: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 8454 1726882426.53273: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8454 1726882426.55115: stdout chunk (state=3): >>>/root <<< 8454 1726882426.55222: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8454 1726882426.55272: stderr chunk (state=3): >>><<< 8454 1726882426.55275: stdout chunk (state=3): >>><<< 8454 1726882426.55296: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match 
not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 8454 1726882426.55309: _low_level_execute_command(): starting 8454 1726882426.55315: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882426.5529702-9321-254877709400465 `" && echo ansible-tmp-1726882426.5529702-9321-254877709400465="` echo /root/.ansible/tmp/ansible-tmp-1726882426.5529702-9321-254877709400465 `" ) && sleep 0' 8454 1726882426.55731: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 8454 1726882426.55766: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 <<< 8454 1726882426.55770: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8454 1726882426.55781: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: 
re-parsing configuration <<< 8454 1726882426.55786: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found <<< 8454 1726882426.55789: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8454 1726882426.55832: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 8454 1726882426.55838: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 8454 1726882426.55961: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8454 1726882426.58027: stdout chunk (state=3): >>>ansible-tmp-1726882426.5529702-9321-254877709400465=/root/.ansible/tmp/ansible-tmp-1726882426.5529702-9321-254877709400465 <<< 8454 1726882426.58145: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8454 1726882426.58191: stderr chunk (state=3): >>><<< 8454 1726882426.58195: stdout chunk (state=3): >>><<< 8454 1726882426.58210: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882426.5529702-9321-254877709400465=/root/.ansible/tmp/ansible-tmp-1726882426.5529702-9321-254877709400465 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 8454 1726882426.58245: variable 'ansible_module_compression' from source: unknown 8454 1726882426.58293: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-8454ynh6eu7y/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 8454 1726882426.58323: variable 'ansible_facts' from source: unknown 8454 1726882426.58382: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882426.5529702-9321-254877709400465/AnsiballZ_stat.py 8454 1726882426.58472: Sending initial data 8454 1726882426.58476: Sent initial data (151 bytes) 8454 1726882426.58915: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 8454 1726882426.58920: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8454 1726882426.58923: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 8454 1726882426.58925: stderr chunk (state=3): >>>debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8454 1726882426.58984: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 8454 1726882426.58987: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8454 1726882426.59100: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8454 1726882426.60776: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" <<< 8454 1726882426.60785: stderr chunk (state=3): >>>debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 8454 1726882426.60887: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 8454 1726882426.61006: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-8454ynh6eu7y/tmpn9kdolda /root/.ansible/tmp/ansible-tmp-1726882426.5529702-9321-254877709400465/AnsiballZ_stat.py <<< 8454 1726882426.61010: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882426.5529702-9321-254877709400465/AnsiballZ_stat.py" <<< 8454 1726882426.61109: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-8454ynh6eu7y/tmpn9kdolda" to remote "/root/.ansible/tmp/ansible-tmp-1726882426.5529702-9321-254877709400465/AnsiballZ_stat.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882426.5529702-9321-254877709400465/AnsiballZ_stat.py" <<< 8454 1726882426.62175: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8454 1726882426.62236: stderr chunk (state=3): >>><<< 8454 1726882426.62240: stdout chunk (state=3): >>><<< 8454 1726882426.62258: done transferring module to remote 8454 1726882426.62268: _low_level_execute_command(): starting 8454 1726882426.62273: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882426.5529702-9321-254877709400465/ /root/.ansible/tmp/ansible-tmp-1726882426.5529702-9321-254877709400465/AnsiballZ_stat.py && sleep 0' 8454 1726882426.62694: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 8454 1726882426.62697: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found <<< 8454 1726882426.62700: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match 
pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration <<< 8454 1726882426.62702: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 8454 1726882426.62704: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8454 1726882426.62760: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 8454 1726882426.62766: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8454 1726882426.62882: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8454 1726882426.64764: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8454 1726882426.64806: stderr chunk (state=3): >>><<< 8454 1726882426.64811: stdout chunk (state=3): >>><<< 8454 1726882426.64829: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 
originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 8454 1726882426.64832: _low_level_execute_command(): starting 8454 1726882426.64839: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882426.5529702-9321-254877709400465/AnsiballZ_stat.py && sleep 0' 8454 1726882426.65294: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 8454 1726882426.65297: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8454 1726882426.65299: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 8454 1726882426.65302: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found <<< 8454 1726882426.65304: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8454 1726882426.65353: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 8454 1726882426.65360: stderr chunk (state=3): >>>debug2: 
mux_client_hello_exchange: master version 4 <<< 8454 1726882426.65479: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8454 1726882426.82884: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-bond0.0", "follow": false, "checksum_algorithm": "sha1"}}} <<< 8454 1726882426.84156: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.41.238 closed. <<< 8454 1726882426.84309: stderr chunk (state=3): >>><<< 8454 1726882426.84313: stdout chunk (state=3): >>><<< 8454 1726882426.84336: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-bond0.0", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master 
debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.41.238 closed. 8454 1726882426.84442: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/etc/sysconfig/network-scripts/ifcfg-bond0.0', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882426.5529702-9321-254877709400465/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 8454 1726882426.84455: _low_level_execute_command(): starting 8454 1726882426.84461: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882426.5529702-9321-254877709400465/ > /dev/null 2>&1 && sleep 0' 8454 1726882426.85757: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 8454 1726882426.85851: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8454 1726882426.85867: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 8454 1726882426.85876: stderr chunk (state=3): >>>debug2: 
resolve_canonicalize: hostname 10.31.41.238 is address <<< 8454 1726882426.85961: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8454 1726882426.86075: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 8454 1726882426.86140: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 8454 1726882426.86147: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8454 1726882426.86286: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8454 1726882426.88550: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8454 1726882426.88553: stdout chunk (state=3): >>><<< 8454 1726882426.88562: stderr chunk (state=3): >>><<< 8454 1726882426.88744: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 
10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 8454 1726882426.88748: handler run complete 8454 1726882426.88750: attempt loop complete, returning result 8454 1726882426.88753: _execute() done 8454 1726882426.88755: dumping result to json 8454 1726882426.88757: done dumping result, returning 8454 1726882426.88759: done running TaskExecutor() for managed_node3/TASK: Stat profile file [0affe814-3a2d-f59f-16b9-0000000003f9] 8454 1726882426.88761: sending task result for task 0affe814-3a2d-f59f-16b9-0000000003f9 8454 1726882426.88843: done sending task result for task 0affe814-3a2d-f59f-16b9-0000000003f9 8454 1726882426.88847: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "stat": { "exists": false } } 8454 1726882426.88933: no more pending results, returning what we have 8454 1726882426.88941: results queue empty 8454 1726882426.88942: checking for any_errors_fatal 8454 1726882426.88951: done checking for any_errors_fatal 8454 1726882426.88952: checking for max_fail_percentage 8454 1726882426.88954: done checking for max_fail_percentage 8454 1726882426.88955: checking to see if all hosts have failed and the running result is not ok 8454 1726882426.88956: done checking to see if all hosts have failed 8454 1726882426.88957: getting the remaining hosts for this loop 8454 1726882426.88959: done getting the remaining hosts for this loop 8454 1726882426.88964: getting the next task for host managed_node3 8454 1726882426.88974: done getting next task for host managed_node3 8454 1726882426.88978: ^ task is: TASK: Set NM profile exist flag based on the profile files 8454 1726882426.88983: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, 
run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8454 1726882426.88988: getting variables 8454 1726882426.88991: in VariableManager get_vars() 8454 1726882426.89345: Calling all_inventory to load vars for managed_node3 8454 1726882426.89349: Calling groups_inventory to load vars for managed_node3 8454 1726882426.89352: Calling all_plugins_inventory to load vars for managed_node3 8454 1726882426.89365: Calling all_plugins_play to load vars for managed_node3 8454 1726882426.89368: Calling groups_plugins_inventory to load vars for managed_node3 8454 1726882426.89372: Calling groups_plugins_play to load vars for managed_node3 8454 1726882426.92885: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8454 1726882426.96541: done with get_vars() 8454 1726882426.96578: done getting variables 8454 1726882426.96857: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag based on the profile files] ******************** 
task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:17 Friday 20 September 2024 21:33:46 -0400 (0:00:00.460) 0:00:24.986 ****** 8454 1726882426.96895: entering _queue_task() for managed_node3/set_fact 8454 1726882426.97589: worker is 1 (out of 1 available) 8454 1726882426.97603: exiting _queue_task() for managed_node3/set_fact 8454 1726882426.97619: done queuing things up, now waiting for results queue to drain 8454 1726882426.97620: waiting for pending results... 8454 1726882426.98152: running TaskExecutor() for managed_node3/TASK: Set NM profile exist flag based on the profile files 8454 1726882426.98443: in run() - task 0affe814-3a2d-f59f-16b9-0000000003fa 8454 1726882426.98447: variable 'ansible_search_path' from source: unknown 8454 1726882426.98450: variable 'ansible_search_path' from source: unknown 8454 1726882426.98454: calling self._execute() 8454 1726882426.98523: variable 'ansible_host' from source: host vars for 'managed_node3' 8454 1726882426.98529: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8454 1726882426.98709: variable 'omit' from source: magic vars 8454 1726882426.99360: variable 'ansible_distribution_major_version' from source: facts 8454 1726882426.99365: Evaluated conditional (ansible_distribution_major_version != '6'): True 8454 1726882426.99451: variable 'profile_stat' from source: set_fact 8454 1726882426.99472: Evaluated conditional (profile_stat.stat.exists): False 8454 1726882426.99475: when evaluation is False, skipping this task 8454 1726882426.99478: _execute() done 8454 1726882426.99487: dumping result to json 8454 1726882426.99490: done dumping result, returning 8454 1726882426.99499: done running TaskExecutor() for managed_node3/TASK: Set NM profile exist flag based on the profile files [0affe814-3a2d-f59f-16b9-0000000003fa] 8454 1726882426.99506: sending task result for task 0affe814-3a2d-f59f-16b9-0000000003fa skipping: 
[managed_node3] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 8454 1726882426.99660: no more pending results, returning what we have 8454 1726882426.99664: results queue empty 8454 1726882426.99665: checking for any_errors_fatal 8454 1726882426.99677: done checking for any_errors_fatal 8454 1726882426.99678: checking for max_fail_percentage 8454 1726882426.99680: done checking for max_fail_percentage 8454 1726882426.99681: checking to see if all hosts have failed and the running result is not ok 8454 1726882426.99682: done checking to see if all hosts have failed 8454 1726882426.99683: getting the remaining hosts for this loop 8454 1726882426.99685: done getting the remaining hosts for this loop 8454 1726882426.99690: getting the next task for host managed_node3 8454 1726882426.99699: done getting next task for host managed_node3 8454 1726882426.99702: ^ task is: TASK: Get NM profile info 8454 1726882426.99707: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8454 1726882426.99711: getting variables 8454 1726882426.99713: in VariableManager get_vars() 8454 1726882426.99757: Calling all_inventory to load vars for managed_node3 8454 1726882426.99760: Calling groups_inventory to load vars for managed_node3 8454 1726882426.99763: Calling all_plugins_inventory to load vars for managed_node3 8454 1726882426.99775: Calling all_plugins_play to load vars for managed_node3 8454 1726882426.99778: Calling groups_plugins_inventory to load vars for managed_node3 8454 1726882426.99781: Calling groups_plugins_play to load vars for managed_node3 8454 1726882427.00952: done sending task result for task 0affe814-3a2d-f59f-16b9-0000000003fa 8454 1726882427.01769: WORKER PROCESS EXITING 8454 1726882427.04120: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8454 1726882427.08636: done with get_vars() 8454 1726882427.08675: done getting variables 8454 1726882427.08752: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Get NM profile info] ***************************************************** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:25 Friday 20 September 2024 21:33:47 -0400 (0:00:00.118) 0:00:25.105 ****** 8454 1726882427.08790: entering _queue_task() for managed_node3/shell 8454 1726882427.09201: worker is 1 (out of 1 available) 8454 1726882427.09214: exiting _queue_task() for managed_node3/shell 8454 1726882427.09230: done queuing things up, now waiting for results queue to drain 8454 1726882427.09231: waiting for pending results... 
8454 1726882427.09551: running TaskExecutor() for managed_node3/TASK: Get NM profile info 8454 1726882427.09721: in run() - task 0affe814-3a2d-f59f-16b9-0000000003fb 8454 1726882427.09752: variable 'ansible_search_path' from source: unknown 8454 1726882427.09762: variable 'ansible_search_path' from source: unknown 8454 1726882427.09838: calling self._execute() 8454 1726882427.09950: variable 'ansible_host' from source: host vars for 'managed_node3' 8454 1726882427.09968: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8454 1726882427.09987: variable 'omit' from source: magic vars 8454 1726882427.10566: variable 'ansible_distribution_major_version' from source: facts 8454 1726882427.10588: Evaluated conditional (ansible_distribution_major_version != '6'): True 8454 1726882427.10600: variable 'omit' from source: magic vars 8454 1726882427.10679: variable 'omit' from source: magic vars 8454 1726882427.10816: variable 'profile' from source: include params 8454 1726882427.10826: variable 'item' from source: include params 8454 1726882427.10920: variable 'item' from source: include params 8454 1726882427.10951: variable 'omit' from source: magic vars 8454 1726882427.11012: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 8454 1726882427.11061: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 8454 1726882427.11095: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 8454 1726882427.11181: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8454 1726882427.11185: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8454 1726882427.11188: variable 'inventory_hostname' from source: host vars for 'managed_node3' 8454 1726882427.11193: variable 
'ansible_host' from source: host vars for 'managed_node3' 8454 1726882427.11203: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8454 1726882427.11361: Set connection var ansible_connection to ssh 8454 1726882427.11485: Set connection var ansible_shell_executable to /bin/sh 8454 1726882427.11839: Set connection var ansible_timeout to 10 8454 1726882427.11842: Set connection var ansible_shell_type to sh 8454 1726882427.11845: Set connection var ansible_pipelining to False 8454 1726882427.11849: Set connection var ansible_module_compression to ZIP_DEFLATED 8454 1726882427.11851: variable 'ansible_shell_executable' from source: unknown 8454 1726882427.11853: variable 'ansible_connection' from source: unknown 8454 1726882427.11855: variable 'ansible_module_compression' from source: unknown 8454 1726882427.11857: variable 'ansible_shell_type' from source: unknown 8454 1726882427.11859: variable 'ansible_shell_executable' from source: unknown 8454 1726882427.11861: variable 'ansible_host' from source: host vars for 'managed_node3' 8454 1726882427.11863: variable 'ansible_pipelining' from source: unknown 8454 1726882427.11866: variable 'ansible_timeout' from source: unknown 8454 1726882427.11868: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8454 1726882427.12016: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 8454 1726882427.12108: variable 'omit' from source: magic vars 8454 1726882427.12120: starting attempt loop 8454 1726882427.12128: running the handler 8454 1726882427.12147: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 8454 1726882427.12173: _low_level_execute_command(): starting 8454 1726882427.12212: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 8454 1726882427.13701: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 8454 1726882427.13746: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8454 1726882427.14050: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 8454 1726882427.14140: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8454 1726882427.16023: stdout chunk (state=3): >>>/root <<< 8454 1726882427.16259: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8454 1726882427.16452: stderr chunk (state=3): >>><<< 8454 1726882427.16456: stdout chunk (state=3): >>><<< 8454 1726882427.16459: _low_level_execute_command() done: rc=0, stdout=/root , 
stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 8454 1726882427.16461: _low_level_execute_command(): starting 8454 1726882427.16464: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882427.1637988-9337-145710582144275 `" && echo ansible-tmp-1726882427.1637988-9337-145710582144275="` echo /root/.ansible/tmp/ansible-tmp-1726882427.1637988-9337-145710582144275 `" ) && sleep 0' 8454 1726882427.17684: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 8454 1726882427.17689: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: 
match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8454 1726882427.17698: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 8454 1726882427.17701: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8454 1726882427.17987: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 8454 1726882427.18101: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8454 1726882427.20353: stdout chunk (state=3): >>>ansible-tmp-1726882427.1637988-9337-145710582144275=/root/.ansible/tmp/ansible-tmp-1726882427.1637988-9337-145710582144275 <<< 8454 1726882427.20561: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8454 1726882427.20564: stdout chunk (state=3): >>><<< 8454 1726882427.20566: stderr chunk (state=3): >>><<< 8454 1726882427.20569: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882427.1637988-9337-145710582144275=/root/.ansible/tmp/ansible-tmp-1726882427.1637988-9337-145710582144275 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config 
debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 8454 1726882427.20572: variable 'ansible_module_compression' from source: unknown 8454 1726882427.20596: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-8454ynh6eu7y/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 8454 1726882427.20638: variable 'ansible_facts' from source: unknown 8454 1726882427.20920: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882427.1637988-9337-145710582144275/AnsiballZ_command.py 8454 1726882427.21321: Sending initial data 8454 1726882427.21324: Sent initial data (154 bytes) 8454 1726882427.22669: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master <<< 8454 1726882427.22687: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 8454 1726882427.22700: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8454 1726882427.22852: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8454 1726882427.24648: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 8454 1726882427.24760: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 8454 1726882427.24879: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-8454ynh6eu7y/tmp7ix3yt7a /root/.ansible/tmp/ansible-tmp-1726882427.1637988-9337-145710582144275/AnsiballZ_command.py <<< 8454 1726882427.24892: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882427.1637988-9337-145710582144275/AnsiballZ_command.py" <<< 8454 1726882427.24994: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-8454ynh6eu7y/tmp7ix3yt7a" to remote "/root/.ansible/tmp/ansible-tmp-1726882427.1637988-9337-145710582144275/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882427.1637988-9337-145710582144275/AnsiballZ_command.py" <<< 8454 1726882427.27313: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8454 1726882427.27348: stderr chunk (state=3): >>><<< 8454 1726882427.27358: stdout chunk (state=3): >>><<< 8454 1726882427.27392: done transferring module to remote 8454 1726882427.27411: _low_level_execute_command(): starting 8454 1726882427.27422: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882427.1637988-9337-145710582144275/ /root/.ansible/tmp/ansible-tmp-1726882427.1637988-9337-145710582144275/AnsiballZ_command.py && sleep 0' 8454 1726882427.28366: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 8454 1726882427.28369: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8454 1726882427.28372: stderr chunk (state=3): >>>debug1: configuration requests 
final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration <<< 8454 1726882427.28375: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 8454 1726882427.28377: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8454 1726882427.28435: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 8454 1726882427.28558: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 8454 1726882427.28561: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8454 1726882427.28604: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8454 1726882427.30717: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8454 1726882427.30721: stdout chunk (state=3): >>><<< 8454 1726882427.30729: stderr chunk (state=3): >>><<< 8454 1726882427.30888: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 8454 1726882427.30893: _low_level_execute_command(): starting 8454 1726882427.30898: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882427.1637988-9337-145710582144275/AnsiballZ_command.py && sleep 0' 8454 1726882427.31841: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8454 1726882427.31919: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 8454 1726882427.31953: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 8454 1726882427.32077: stderr chunk (state=3): >>>debug1: mux_client_request_session: master 
session id: 2 <<< 8454 1726882427.52065: stdout chunk (state=3): >>> {"changed": true, "stdout": "bond0.0 /etc/NetworkManager/system-connections/bond0.0.nmconnection ", "stderr": "", "rc": 0, "cmd": "nmcli -f NAME,FILENAME connection show |grep bond0.0 | grep /etc", "start": "2024-09-20 21:33:47.495609", "end": "2024-09-20 21:33:47.518610", "delta": "0:00:00.023001", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep bond0.0 | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 8454 1726882427.53757: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.41.238 closed. <<< 8454 1726882427.53809: stderr chunk (state=3): >>><<< 8454 1726882427.53812: stdout chunk (state=3): >>><<< 8454 1726882427.53829: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "bond0.0 /etc/NetworkManager/system-connections/bond0.0.nmconnection ", "stderr": "", "rc": 0, "cmd": "nmcli -f NAME,FILENAME connection show |grep bond0.0 | grep /etc", "start": "2024-09-20 21:33:47.495609", "end": "2024-09-20 21:33:47.518610", "delta": "0:00:00.023001", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep bond0.0 | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.41.238 closed. 8454 1726882427.53869: done with _execute_module (ansible.legacy.command, {'_raw_params': 'nmcli -f NAME,FILENAME connection show |grep bond0.0 | grep /etc', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882427.1637988-9337-145710582144275/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 8454 1726882427.53879: _low_level_execute_command(): starting 8454 1726882427.53883: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882427.1637988-9337-145710582144275/ > /dev/null 2>&1 && sleep 0' 8454 1726882427.54306: stderr chunk 
(state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 8454 1726882427.54349: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found <<< 8454 1726882427.54353: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8454 1726882427.54356: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 8454 1726882427.54362: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found <<< 8454 1726882427.54364: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8454 1726882427.54407: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 8454 1726882427.54411: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8454 1726882427.54529: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8454 1726882427.56528: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8454 1726882427.56571: stderr chunk (state=3): >>><<< 8454 1726882427.56574: stdout chunk (state=3): >>><<< 8454 1726882427.56591: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 8454 1726882427.56604: handler run complete 8454 1726882427.56622: Evaluated conditional (False): False 8454 1726882427.56635: attempt loop complete, returning result 8454 1726882427.56638: _execute() done 8454 1726882427.56641: dumping result to json 8454 1726882427.56647: done dumping result, returning 8454 1726882427.56655: done running TaskExecutor() for managed_node3/TASK: Get NM profile info [0affe814-3a2d-f59f-16b9-0000000003fb] 8454 1726882427.56661: sending task result for task 0affe814-3a2d-f59f-16b9-0000000003fb 8454 1726882427.56769: done sending task result for task 0affe814-3a2d-f59f-16b9-0000000003fb 8454 1726882427.56772: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "cmd": "nmcli -f NAME,FILENAME connection show |grep bond0.0 | grep /etc", "delta": "0:00:00.023001", "end": "2024-09-20 21:33:47.518610", "rc": 0, "start": "2024-09-20 21:33:47.495609" } STDOUT: bond0.0 /etc/NetworkManager/system-connections/bond0.0.nmconnection 8454 1726882427.56860: no more pending results, 
returning what we have 8454 1726882427.56864: results queue empty 8454 1726882427.56865: checking for any_errors_fatal 8454 1726882427.56871: done checking for any_errors_fatal 8454 1726882427.56872: checking for max_fail_percentage 8454 1726882427.56874: done checking for max_fail_percentage 8454 1726882427.56876: checking to see if all hosts have failed and the running result is not ok 8454 1726882427.56876: done checking to see if all hosts have failed 8454 1726882427.56877: getting the remaining hosts for this loop 8454 1726882427.56880: done getting the remaining hosts for this loop 8454 1726882427.56893: getting the next task for host managed_node3 8454 1726882427.56900: done getting next task for host managed_node3 8454 1726882427.56903: ^ task is: TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output 8454 1726882427.56908: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8454 1726882427.56912: getting variables 8454 1726882427.56913: in VariableManager get_vars() 8454 1726882427.56955: Calling all_inventory to load vars for managed_node3 8454 1726882427.56958: Calling groups_inventory to load vars for managed_node3 8454 1726882427.56961: Calling all_plugins_inventory to load vars for managed_node3 8454 1726882427.56972: Calling all_plugins_play to load vars for managed_node3 8454 1726882427.56975: Calling groups_plugins_inventory to load vars for managed_node3 8454 1726882427.56978: Calling groups_plugins_play to load vars for managed_node3 8454 1726882427.58310: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8454 1726882427.59840: done with get_vars() 8454 1726882427.59861: done getting variables 8454 1726882427.59911: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag and ansible_managed flag true based on the nmcli output] *** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:35 Friday 20 September 2024 21:33:47 -0400 (0:00:00.511) 0:00:25.617 ****** 8454 1726882427.59940: entering _queue_task() for managed_node3/set_fact 8454 1726882427.60158: worker is 1 (out of 1 available) 8454 1726882427.60171: exiting _queue_task() for managed_node3/set_fact 8454 1726882427.60185: done queuing things up, now waiting for results queue to drain 8454 1726882427.60187: waiting for pending results... 
8454 1726882427.60373: running TaskExecutor() for managed_node3/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output 8454 1726882427.60465: in run() - task 0affe814-3a2d-f59f-16b9-0000000003fc 8454 1726882427.60477: variable 'ansible_search_path' from source: unknown 8454 1726882427.60481: variable 'ansible_search_path' from source: unknown 8454 1726882427.60515: calling self._execute() 8454 1726882427.60595: variable 'ansible_host' from source: host vars for 'managed_node3' 8454 1726882427.60601: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8454 1726882427.60612: variable 'omit' from source: magic vars 8454 1726882427.60937: variable 'ansible_distribution_major_version' from source: facts 8454 1726882427.60949: Evaluated conditional (ansible_distribution_major_version != '6'): True 8454 1726882427.61064: variable 'nm_profile_exists' from source: set_fact 8454 1726882427.61081: Evaluated conditional (nm_profile_exists.rc == 0): True 8454 1726882427.61089: variable 'omit' from source: magic vars 8454 1726882427.61130: variable 'omit' from source: magic vars 8454 1726882427.61157: variable 'omit' from source: magic vars 8454 1726882427.61198: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 8454 1726882427.61229: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 8454 1726882427.61249: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 8454 1726882427.61265: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8454 1726882427.61276: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8454 1726882427.61310: variable 'inventory_hostname' from source: host vars for 'managed_node3' 8454 1726882427.61314: 
variable 'ansible_host' from source: host vars for 'managed_node3' 8454 1726882427.61318: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8454 1726882427.61404: Set connection var ansible_connection to ssh 8454 1726882427.61412: Set connection var ansible_shell_executable to /bin/sh 8454 1726882427.61420: Set connection var ansible_timeout to 10 8454 1726882427.61423: Set connection var ansible_shell_type to sh 8454 1726882427.61432: Set connection var ansible_pipelining to False 8454 1726882427.61440: Set connection var ansible_module_compression to ZIP_DEFLATED 8454 1726882427.61459: variable 'ansible_shell_executable' from source: unknown 8454 1726882427.61462: variable 'ansible_connection' from source: unknown 8454 1726882427.61466: variable 'ansible_module_compression' from source: unknown 8454 1726882427.61469: variable 'ansible_shell_type' from source: unknown 8454 1726882427.61473: variable 'ansible_shell_executable' from source: unknown 8454 1726882427.61477: variable 'ansible_host' from source: host vars for 'managed_node3' 8454 1726882427.61484: variable 'ansible_pipelining' from source: unknown 8454 1726882427.61487: variable 'ansible_timeout' from source: unknown 8454 1726882427.61493: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8454 1726882427.61612: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 8454 1726882427.61620: variable 'omit' from source: magic vars 8454 1726882427.61629: starting attempt loop 8454 1726882427.61633: running the handler 8454 1726882427.61647: handler run complete 8454 1726882427.61657: attempt loop complete, returning result 8454 1726882427.61660: _execute() done 8454 1726882427.61663: dumping result to json 
8454 1726882427.61668: done dumping result, returning 8454 1726882427.61676: done running TaskExecutor() for managed_node3/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output [0affe814-3a2d-f59f-16b9-0000000003fc] 8454 1726882427.61684: sending task result for task 0affe814-3a2d-f59f-16b9-0000000003fc 8454 1726882427.61774: done sending task result for task 0affe814-3a2d-f59f-16b9-0000000003fc 8454 1726882427.61777: WORKER PROCESS EXITING ok: [managed_node3] => { "ansible_facts": { "lsr_net_profile_ansible_managed": true, "lsr_net_profile_exists": true, "lsr_net_profile_fingerprint": true }, "changed": false } 8454 1726882427.61838: no more pending results, returning what we have 8454 1726882427.61842: results queue empty 8454 1726882427.61844: checking for any_errors_fatal 8454 1726882427.61852: done checking for any_errors_fatal 8454 1726882427.61853: checking for max_fail_percentage 8454 1726882427.61855: done checking for max_fail_percentage 8454 1726882427.61856: checking to see if all hosts have failed and the running result is not ok 8454 1726882427.61857: done checking to see if all hosts have failed 8454 1726882427.61858: getting the remaining hosts for this loop 8454 1726882427.61860: done getting the remaining hosts for this loop 8454 1726882427.61863: getting the next task for host managed_node3 8454 1726882427.61873: done getting next task for host managed_node3 8454 1726882427.61876: ^ task is: TASK: Get the ansible_managed comment in ifcfg-{{ profile }} 8454 1726882427.61880: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8454 1726882427.61884: getting variables 8454 1726882427.61885: in VariableManager get_vars() 8454 1726882427.61921: Calling all_inventory to load vars for managed_node3 8454 1726882427.61924: Calling groups_inventory to load vars for managed_node3 8454 1726882427.61927: Calling all_plugins_inventory to load vars for managed_node3 8454 1726882427.61944: Calling all_plugins_play to load vars for managed_node3 8454 1726882427.61948: Calling groups_plugins_inventory to load vars for managed_node3 8454 1726882427.61951: Calling groups_plugins_play to load vars for managed_node3 8454 1726882427.63214: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8454 1726882427.65336: done with get_vars() 8454 1726882427.65357: done getting variables 8454 1726882427.65406: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 8454 1726882427.65502: variable 'profile' from source: include params 8454 1726882427.65506: variable 'item' from source: include params 8454 1726882427.65557: variable 'item' from source: include params TASK [Get the ansible_managed comment in ifcfg-bond0.0] ************************ task path: 
/tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:49 Friday 20 September 2024 21:33:47 -0400 (0:00:00.056) 0:00:25.673 ****** 8454 1726882427.65588: entering _queue_task() for managed_node3/command 8454 1726882427.65809: worker is 1 (out of 1 available) 8454 1726882427.65823: exiting _queue_task() for managed_node3/command 8454 1726882427.65840: done queuing things up, now waiting for results queue to drain 8454 1726882427.65842: waiting for pending results... 8454 1726882427.66019: running TaskExecutor() for managed_node3/TASK: Get the ansible_managed comment in ifcfg-bond0.0 8454 1726882427.66108: in run() - task 0affe814-3a2d-f59f-16b9-0000000003fe 8454 1726882427.66120: variable 'ansible_search_path' from source: unknown 8454 1726882427.66126: variable 'ansible_search_path' from source: unknown 8454 1726882427.66157: calling self._execute() 8454 1726882427.66236: variable 'ansible_host' from source: host vars for 'managed_node3' 8454 1726882427.66241: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8454 1726882427.66252: variable 'omit' from source: magic vars 8454 1726882427.66562: variable 'ansible_distribution_major_version' from source: facts 8454 1726882427.66573: Evaluated conditional (ansible_distribution_major_version != '6'): True 8454 1726882427.66675: variable 'profile_stat' from source: set_fact 8454 1726882427.66687: Evaluated conditional (profile_stat.stat.exists): False 8454 1726882427.66691: when evaluation is False, skipping this task 8454 1726882427.66694: _execute() done 8454 1726882427.66699: dumping result to json 8454 1726882427.66703: done dumping result, returning 8454 1726882427.66710: done running TaskExecutor() for managed_node3/TASK: Get the ansible_managed comment in ifcfg-bond0.0 [0affe814-3a2d-f59f-16b9-0000000003fe] 8454 1726882427.66717: sending task result for task 0affe814-3a2d-f59f-16b9-0000000003fe 8454 1726882427.66811: done 
sending task result for task 0affe814-3a2d-f59f-16b9-0000000003fe 8454 1726882427.66814: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 8454 1726882427.66888: no more pending results, returning what we have 8454 1726882427.66892: results queue empty 8454 1726882427.66893: checking for any_errors_fatal 8454 1726882427.66899: done checking for any_errors_fatal 8454 1726882427.66900: checking for max_fail_percentage 8454 1726882427.66902: done checking for max_fail_percentage 8454 1726882427.66903: checking to see if all hosts have failed and the running result is not ok 8454 1726882427.66904: done checking to see if all hosts have failed 8454 1726882427.66905: getting the remaining hosts for this loop 8454 1726882427.66907: done getting the remaining hosts for this loop 8454 1726882427.66910: getting the next task for host managed_node3 8454 1726882427.66916: done getting next task for host managed_node3 8454 1726882427.66919: ^ task is: TASK: Verify the ansible_managed comment in ifcfg-{{ profile }} 8454 1726882427.66923: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False 8454 1726882427.66926: getting variables 8454 1726882427.66928: in VariableManager get_vars() 8454 1726882427.66965: Calling all_inventory to load vars for managed_node3 8454 1726882427.66968: Calling groups_inventory to load vars for managed_node3 8454 1726882427.66971: Calling all_plugins_inventory to load vars for managed_node3 8454 1726882427.66983: Calling all_plugins_play to load vars for managed_node3 8454 1726882427.66985: Calling groups_plugins_inventory to load vars for managed_node3 8454 1726882427.66988: Calling groups_plugins_play to load vars for managed_node3 8454 1726882427.69023: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8454 1726882427.72451: done with get_vars() 8454 1726882427.72493: done getting variables 8454 1726882427.72564: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 8454 1726882427.72741: variable 'profile' from source: include params 8454 1726882427.72745: variable 'item' from source: include params 8454 1726882427.72827: variable 'item' from source: include params TASK [Verify the ansible_managed comment in ifcfg-bond0.0] ********************* task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:56 Friday 20 September 2024 21:33:47 -0400 (0:00:00.072) 0:00:25.746 ****** 8454 1726882427.72866: entering _queue_task() for managed_node3/set_fact 8454 1726882427.73205: worker is 1 (out of 1 available) 8454 1726882427.73218: exiting _queue_task() for managed_node3/set_fact 8454 1726882427.73232: done queuing things up, now waiting for results queue to drain 8454 1726882427.73235: waiting for pending results... 
8454 1726882427.73657: running TaskExecutor() for managed_node3/TASK: Verify the ansible_managed comment in ifcfg-bond0.0 8454 1726882427.73705: in run() - task 0affe814-3a2d-f59f-16b9-0000000003ff 8454 1726882427.73738: variable 'ansible_search_path' from source: unknown 8454 1726882427.73756: variable 'ansible_search_path' from source: unknown 8454 1726882427.73804: calling self._execute() 8454 1726882427.73916: variable 'ansible_host' from source: host vars for 'managed_node3' 8454 1726882427.73931: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8454 1726882427.73952: variable 'omit' from source: magic vars 8454 1726882427.74396: variable 'ansible_distribution_major_version' from source: facts 8454 1726882427.74420: Evaluated conditional (ansible_distribution_major_version != '6'): True 8454 1726882427.74582: variable 'profile_stat' from source: set_fact 8454 1726882427.74622: Evaluated conditional (profile_stat.stat.exists): False 8454 1726882427.74625: when evaluation is False, skipping this task 8454 1726882427.74627: _execute() done 8454 1726882427.74630: dumping result to json 8454 1726882427.74631: done dumping result, returning 8454 1726882427.74635: done running TaskExecutor() for managed_node3/TASK: Verify the ansible_managed comment in ifcfg-bond0.0 [0affe814-3a2d-f59f-16b9-0000000003ff] 8454 1726882427.74648: sending task result for task 0affe814-3a2d-f59f-16b9-0000000003ff 8454 1726882427.74802: done sending task result for task 0affe814-3a2d-f59f-16b9-0000000003ff 8454 1726882427.74805: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 8454 1726882427.74893: no more pending results, returning what we have 8454 1726882427.74898: results queue empty 8454 1726882427.74900: checking for any_errors_fatal 8454 1726882427.74909: done checking for any_errors_fatal 8454 1726882427.74910: checking for 
max_fail_percentage 8454 1726882427.74913: done checking for max_fail_percentage 8454 1726882427.74914: checking to see if all hosts have failed and the running result is not ok 8454 1726882427.74915: done checking to see if all hosts have failed 8454 1726882427.74916: getting the remaining hosts for this loop 8454 1726882427.74919: done getting the remaining hosts for this loop 8454 1726882427.74923: getting the next task for host managed_node3 8454 1726882427.74933: done getting next task for host managed_node3 8454 1726882427.74939: ^ task is: TASK: Get the fingerprint comment in ifcfg-{{ profile }} 8454 1726882427.74944: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8454 1726882427.74949: getting variables 8454 1726882427.74950: in VariableManager get_vars() 8454 1726882427.74997: Calling all_inventory to load vars for managed_node3 8454 1726882427.75001: Calling groups_inventory to load vars for managed_node3 8454 1726882427.75003: Calling all_plugins_inventory to load vars for managed_node3 8454 1726882427.75018: Calling all_plugins_play to load vars for managed_node3 8454 1726882427.75022: Calling groups_plugins_inventory to load vars for managed_node3 8454 1726882427.75025: Calling groups_plugins_play to load vars for managed_node3 8454 1726882427.77438: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8454 1726882427.80366: done with get_vars() 8454 1726882427.80401: done getting variables 8454 1726882427.80469: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 8454 1726882427.80594: variable 'profile' from source: include params 8454 1726882427.80599: variable 'item' from source: include params 8454 1726882427.80670: variable 'item' from source: include params TASK [Get the fingerprint comment in ifcfg-bond0.0] **************************** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:62 Friday 20 September 2024 21:33:47 -0400 (0:00:00.078) 0:00:25.824 ****** 8454 1726882427.80707: entering _queue_task() for managed_node3/command 8454 1726882427.81003: worker is 1 (out of 1 available) 8454 1726882427.81017: exiting _queue_task() for managed_node3/command 8454 1726882427.81032: done queuing things up, now waiting for results queue to drain 8454 1726882427.81033: waiting for pending results... 
8454 1726882427.81366: running TaskExecutor() for managed_node3/TASK: Get the fingerprint comment in ifcfg-bond0.0 8454 1726882427.81572: in run() - task 0affe814-3a2d-f59f-16b9-000000000400 8454 1726882427.81577: variable 'ansible_search_path' from source: unknown 8454 1726882427.81583: variable 'ansible_search_path' from source: unknown 8454 1726882427.81585: calling self._execute() 8454 1726882427.81648: variable 'ansible_host' from source: host vars for 'managed_node3' 8454 1726882427.81661: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8454 1726882427.81686: variable 'omit' from source: magic vars 8454 1726882427.82116: variable 'ansible_distribution_major_version' from source: facts 8454 1726882427.82136: Evaluated conditional (ansible_distribution_major_version != '6'): True 8454 1726882427.82300: variable 'profile_stat' from source: set_fact 8454 1726882427.82321: Evaluated conditional (profile_stat.stat.exists): False 8454 1726882427.82334: when evaluation is False, skipping this task 8454 1726882427.82344: _execute() done 8454 1726882427.82351: dumping result to json 8454 1726882427.82359: done dumping result, returning 8454 1726882427.82369: done running TaskExecutor() for managed_node3/TASK: Get the fingerprint comment in ifcfg-bond0.0 [0affe814-3a2d-f59f-16b9-000000000400] 8454 1726882427.82383: sending task result for task 0affe814-3a2d-f59f-16b9-000000000400 8454 1726882427.82687: done sending task result for task 0affe814-3a2d-f59f-16b9-000000000400 8454 1726882427.82690: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 8454 1726882427.82736: no more pending results, returning what we have 8454 1726882427.82740: results queue empty 8454 1726882427.82741: checking for any_errors_fatal 8454 1726882427.82747: done checking for any_errors_fatal 8454 1726882427.82748: checking for max_fail_percentage 8454 
1726882427.82750: done checking for max_fail_percentage 8454 1726882427.82751: checking to see if all hosts have failed and the running result is not ok 8454 1726882427.82752: done checking to see if all hosts have failed 8454 1726882427.82753: getting the remaining hosts for this loop 8454 1726882427.82755: done getting the remaining hosts for this loop 8454 1726882427.82758: getting the next task for host managed_node3 8454 1726882427.82765: done getting next task for host managed_node3 8454 1726882427.82768: ^ task is: TASK: Verify the fingerprint comment in ifcfg-{{ profile }} 8454 1726882427.82772: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8454 1726882427.82776: getting variables 8454 1726882427.82779: in VariableManager get_vars() 8454 1726882427.82816: Calling all_inventory to load vars for managed_node3 8454 1726882427.82819: Calling groups_inventory to load vars for managed_node3 8454 1726882427.82822: Calling all_plugins_inventory to load vars for managed_node3 8454 1726882427.82833: Calling all_plugins_play to load vars for managed_node3 8454 1726882427.82838: Calling groups_plugins_inventory to load vars for managed_node3 8454 1726882427.82842: Calling groups_plugins_play to load vars for managed_node3 8454 1726882427.85073: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8454 1726882427.89076: done with get_vars() 8454 1726882427.89118: done getting variables 8454 1726882427.89396: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 8454 1726882427.89528: variable 'profile' from source: include params 8454 1726882427.89533: variable 'item' from source: include params 8454 1726882427.89814: variable 'item' from source: include params TASK [Verify the fingerprint comment in ifcfg-bond0.0] ************************* task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:69 Friday 20 September 2024 21:33:47 -0400 (0:00:00.091) 0:00:25.916 ****** 8454 1726882427.89855: entering _queue_task() for managed_node3/set_fact 8454 1726882427.90373: worker is 1 (out of 1 available) 8454 1726882427.90388: exiting _queue_task() for managed_node3/set_fact 8454 1726882427.90450: done queuing things up, now waiting for results queue to drain 8454 1726882427.90452: waiting for pending results... 
8454 1726882427.90798: running TaskExecutor() for managed_node3/TASK: Verify the fingerprint comment in ifcfg-bond0.0 8454 1726882427.90953: in run() - task 0affe814-3a2d-f59f-16b9-000000000401 8454 1726882427.90972: variable 'ansible_search_path' from source: unknown 8454 1726882427.90976: variable 'ansible_search_path' from source: unknown 8454 1726882427.91014: calling self._execute() 8454 1726882427.91360: variable 'ansible_host' from source: host vars for 'managed_node3' 8454 1726882427.91374: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8454 1726882427.91393: variable 'omit' from source: magic vars 8454 1726882427.91806: variable 'ansible_distribution_major_version' from source: facts 8454 1726882427.91825: Evaluated conditional (ansible_distribution_major_version != '6'): True 8454 1726882427.92151: variable 'profile_stat' from source: set_fact 8454 1726882427.92154: Evaluated conditional (profile_stat.stat.exists): False 8454 1726882427.92157: when evaluation is False, skipping this task 8454 1726882427.92160: _execute() done 8454 1726882427.92162: dumping result to json 8454 1726882427.92164: done dumping result, returning 8454 1726882427.92167: done running TaskExecutor() for managed_node3/TASK: Verify the fingerprint comment in ifcfg-bond0.0 [0affe814-3a2d-f59f-16b9-000000000401] 8454 1726882427.92170: sending task result for task 0affe814-3a2d-f59f-16b9-000000000401 8454 1726882427.92485: done sending task result for task 0affe814-3a2d-f59f-16b9-000000000401 8454 1726882427.92491: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 8454 1726882427.92548: no more pending results, returning what we have 8454 1726882427.92552: results queue empty 8454 1726882427.92552: checking for any_errors_fatal 8454 1726882427.92559: done checking for any_errors_fatal 8454 1726882427.92560: checking for max_fail_percentage 
8454 1726882427.92561: done checking for max_fail_percentage 8454 1726882427.92563: checking to see if all hosts have failed and the running result is not ok 8454 1726882427.92563: done checking to see if all hosts have failed 8454 1726882427.92564: getting the remaining hosts for this loop 8454 1726882427.92566: done getting the remaining hosts for this loop 8454 1726882427.92570: getting the next task for host managed_node3 8454 1726882427.92580: done getting next task for host managed_node3 8454 1726882427.92584: ^ task is: TASK: Assert that the profile is present - '{{ profile }}' 8454 1726882427.92588: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8454 1726882427.92641: getting variables 8454 1726882427.92643: in VariableManager get_vars() 8454 1726882427.92682: Calling all_inventory to load vars for managed_node3 8454 1726882427.92685: Calling groups_inventory to load vars for managed_node3 8454 1726882427.92687: Calling all_plugins_inventory to load vars for managed_node3 8454 1726882427.92812: Calling all_plugins_play to load vars for managed_node3 8454 1726882427.92819: Calling groups_plugins_inventory to load vars for managed_node3 8454 1726882427.92825: Calling groups_plugins_play to load vars for managed_node3 8454 1726882427.95424: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8454 1726882427.99095: done with get_vars() 8454 1726882427.99136: done getting variables 8454 1726882427.99244: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 8454 1726882427.99416: variable 'profile' from source: include params 8454 1726882427.99421: variable 'item' from source: include params 8454 1726882427.99496: variable 'item' from source: include params TASK [Assert that the profile is present - 'bond0.0'] ************************** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:5 Friday 20 September 2024 21:33:47 -0400 (0:00:00.096) 0:00:26.013 ****** 8454 1726882427.99551: entering _queue_task() for managed_node3/assert 8454 1726882427.99973: worker is 1 (out of 1 available) 8454 1726882427.99987: exiting _queue_task() for managed_node3/assert 8454 1726882428.00001: done queuing things up, now waiting for results queue to drain 8454 1726882428.00003: waiting for pending results... 
8454 1726882428.00309: running TaskExecutor() for managed_node3/TASK: Assert that the profile is present - 'bond0.0' 8454 1726882428.00743: in run() - task 0affe814-3a2d-f59f-16b9-000000000267 8454 1726882428.00940: variable 'ansible_search_path' from source: unknown 8454 1726882428.00945: variable 'ansible_search_path' from source: unknown 8454 1726882428.00949: calling self._execute() 8454 1726882428.00953: variable 'ansible_host' from source: host vars for 'managed_node3' 8454 1726882428.00956: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8454 1726882428.01340: variable 'omit' from source: magic vars 8454 1726882428.01846: variable 'ansible_distribution_major_version' from source: facts 8454 1726882428.01956: Evaluated conditional (ansible_distribution_major_version != '6'): True 8454 1726882428.01969: variable 'omit' from source: magic vars 8454 1726882428.02025: variable 'omit' from source: magic vars 8454 1726882428.02215: variable 'profile' from source: include params 8454 1726882428.02244: variable 'item' from source: include params 8454 1726882428.02348: variable 'item' from source: include params 8454 1726882428.02386: variable 'omit' from source: magic vars 8454 1726882428.02455: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 8454 1726882428.02753: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 8454 1726882428.02967: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 8454 1726882428.03011: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8454 1726882428.03029: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8454 1726882428.03074: variable 'inventory_hostname' from source: host vars for 'managed_node3' 8454 
1726882428.03087: variable 'ansible_host' from source: host vars for 'managed_node3' 8454 1726882428.03097: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8454 1726882428.03225: Set connection var ansible_connection to ssh 8454 1726882428.03356: Set connection var ansible_shell_executable to /bin/sh 8454 1726882428.03369: Set connection var ansible_timeout to 10 8454 1726882428.03376: Set connection var ansible_shell_type to sh 8454 1726882428.03395: Set connection var ansible_pipelining to False 8454 1726882428.03640: Set connection var ansible_module_compression to ZIP_DEFLATED 8454 1726882428.03644: variable 'ansible_shell_executable' from source: unknown 8454 1726882428.03647: variable 'ansible_connection' from source: unknown 8454 1726882428.03649: variable 'ansible_module_compression' from source: unknown 8454 1726882428.03651: variable 'ansible_shell_type' from source: unknown 8454 1726882428.03654: variable 'ansible_shell_executable' from source: unknown 8454 1726882428.03656: variable 'ansible_host' from source: host vars for 'managed_node3' 8454 1726882428.03658: variable 'ansible_pipelining' from source: unknown 8454 1726882428.03660: variable 'ansible_timeout' from source: unknown 8454 1726882428.03662: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8454 1726882428.03908: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 8454 1726882428.04046: variable 'omit' from source: magic vars 8454 1726882428.04072: starting attempt loop 8454 1726882428.04084: running the handler 8454 1726882428.04231: variable 'lsr_net_profile_exists' from source: set_fact 8454 1726882428.04290: Evaluated conditional (lsr_net_profile_exists): True 8454 1726882428.04303: 
handler run complete 8454 1726882428.04364: attempt loop complete, returning result 8454 1726882428.04373: _execute() done 8454 1726882428.04382: dumping result to json 8454 1726882428.04391: done dumping result, returning 8454 1726882428.04403: done running TaskExecutor() for managed_node3/TASK: Assert that the profile is present - 'bond0.0' [0affe814-3a2d-f59f-16b9-000000000267] 8454 1726882428.04414: sending task result for task 0affe814-3a2d-f59f-16b9-000000000267 ok: [managed_node3] => { "changed": false } MSG: All assertions passed 8454 1726882428.04576: no more pending results, returning what we have 8454 1726882428.04583: results queue empty 8454 1726882428.04584: checking for any_errors_fatal 8454 1726882428.04592: done checking for any_errors_fatal 8454 1726882428.04593: checking for max_fail_percentage 8454 1726882428.04595: done checking for max_fail_percentage 8454 1726882428.04596: checking to see if all hosts have failed and the running result is not ok 8454 1726882428.04597: done checking to see if all hosts have failed 8454 1726882428.04598: getting the remaining hosts for this loop 8454 1726882428.04600: done getting the remaining hosts for this loop 8454 1726882428.04605: getting the next task for host managed_node3 8454 1726882428.04615: done getting next task for host managed_node3 8454 1726882428.04619: ^ task is: TASK: Assert that the ansible managed comment is present in '{{ profile }}' 8454 1726882428.04623: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False 8454 1726882428.04627: getting variables 8454 1726882428.04629: in VariableManager get_vars() 8454 1726882428.04675: Calling all_inventory to load vars for managed_node3 8454 1726882428.04681: Calling groups_inventory to load vars for managed_node3 8454 1726882428.04684: Calling all_plugins_inventory to load vars for managed_node3 8454 1726882428.04698: Calling all_plugins_play to load vars for managed_node3 8454 1726882428.04701: Calling groups_plugins_inventory to load vars for managed_node3 8454 1726882428.04706: Calling groups_plugins_play to load vars for managed_node3 8454 1726882428.05226: done sending task result for task 0affe814-3a2d-f59f-16b9-000000000267 8454 1726882428.05230: WORKER PROCESS EXITING 8454 1726882428.07351: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8454 1726882428.10298: done with get_vars() 8454 1726882428.10338: done getting variables 8454 1726882428.10408: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 8454 1726882428.10547: variable 'profile' from source: include params 8454 1726882428.10552: variable 'item' from source: include params 8454 1726882428.10624: variable 'item' from source: include params TASK [Assert that the ansible managed comment is present in 'bond0.0'] ********* task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:10 Friday 20 September 2024 21:33:48 -0400 (0:00:00.111) 0:00:26.124 ****** 8454 1726882428.10671: entering _queue_task() for managed_node3/assert 8454 1726882428.11021: worker is 1 (out of 1 available) 8454 1726882428.11236: exiting _queue_task() for 
managed_node3/assert 8454 1726882428.11251: done queuing things up, now waiting for results queue to drain 8454 1726882428.11253: waiting for pending results... 8454 1726882428.11384: running TaskExecutor() for managed_node3/TASK: Assert that the ansible managed comment is present in 'bond0.0' 8454 1726882428.11486: in run() - task 0affe814-3a2d-f59f-16b9-000000000268 8454 1726882428.11510: variable 'ansible_search_path' from source: unknown 8454 1726882428.11588: variable 'ansible_search_path' from source: unknown 8454 1726882428.11593: calling self._execute() 8454 1726882428.11683: variable 'ansible_host' from source: host vars for 'managed_node3' 8454 1726882428.11703: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8454 1726882428.11722: variable 'omit' from source: magic vars 8454 1726882428.12156: variable 'ansible_distribution_major_version' from source: facts 8454 1726882428.12176: Evaluated conditional (ansible_distribution_major_version != '6'): True 8454 1726882428.12190: variable 'omit' from source: magic vars 8454 1726882428.12245: variable 'omit' from source: magic vars 8454 1726882428.12372: variable 'profile' from source: include params 8454 1726882428.12383: variable 'item' from source: include params 8454 1726882428.12466: variable 'item' from source: include params 8454 1726882428.12568: variable 'omit' from source: magic vars 8454 1726882428.12572: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 8454 1726882428.12599: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 8454 1726882428.12628: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 8454 1726882428.12657: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8454 1726882428.12678: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8454 1726882428.12716: variable 'inventory_hostname' from source: host vars for 'managed_node3' 8454 1726882428.12725: variable 'ansible_host' from source: host vars for 'managed_node3' 8454 1726882428.12735: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8454 1726882428.12864: Set connection var ansible_connection to ssh 8454 1726882428.12882: Set connection var ansible_shell_executable to /bin/sh 8454 1726882428.12899: Set connection var ansible_timeout to 10 8454 1726882428.12906: Set connection var ansible_shell_type to sh 8454 1726882428.12921: Set connection var ansible_pipelining to False 8454 1726882428.13001: Set connection var ansible_module_compression to ZIP_DEFLATED 8454 1726882428.13005: variable 'ansible_shell_executable' from source: unknown 8454 1726882428.13007: variable 'ansible_connection' from source: unknown 8454 1726882428.13009: variable 'ansible_module_compression' from source: unknown 8454 1726882428.13012: variable 'ansible_shell_type' from source: unknown 8454 1726882428.13014: variable 'ansible_shell_executable' from source: unknown 8454 1726882428.13016: variable 'ansible_host' from source: host vars for 'managed_node3' 8454 1726882428.13018: variable 'ansible_pipelining' from source: unknown 8454 1726882428.13020: variable 'ansible_timeout' from source: unknown 8454 1726882428.13022: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8454 1726882428.13179: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 8454 1726882428.13199: variable 'omit' from source: magic vars 8454 1726882428.13210: starting attempt loop 8454 
1726882428.13222: running the handler 8454 1726882428.13363: variable 'lsr_net_profile_ansible_managed' from source: set_fact 8454 1726882428.13374: Evaluated conditional (lsr_net_profile_ansible_managed): True 8454 1726882428.13385: handler run complete 8454 1726882428.13408: attempt loop complete, returning result 8454 1726882428.13416: _execute() done 8454 1726882428.13435: dumping result to json 8454 1726882428.13438: done dumping result, returning 8454 1726882428.13448: done running TaskExecutor() for managed_node3/TASK: Assert that the ansible managed comment is present in 'bond0.0' [0affe814-3a2d-f59f-16b9-000000000268] 8454 1726882428.13541: sending task result for task 0affe814-3a2d-f59f-16b9-000000000268 8454 1726882428.13610: done sending task result for task 0affe814-3a2d-f59f-16b9-000000000268 8454 1726882428.13613: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false } MSG: All assertions passed 8454 1726882428.13700: no more pending results, returning what we have 8454 1726882428.13704: results queue empty 8454 1726882428.13705: checking for any_errors_fatal 8454 1726882428.13713: done checking for any_errors_fatal 8454 1726882428.13714: checking for max_fail_percentage 8454 1726882428.13716: done checking for max_fail_percentage 8454 1726882428.13717: checking to see if all hosts have failed and the running result is not ok 8454 1726882428.13718: done checking to see if all hosts have failed 8454 1726882428.13719: getting the remaining hosts for this loop 8454 1726882428.13721: done getting the remaining hosts for this loop 8454 1726882428.13726: getting the next task for host managed_node3 8454 1726882428.13736: done getting next task for host managed_node3 8454 1726882428.13739: ^ task is: TASK: Assert that the fingerprint comment is present in {{ profile }} 8454 1726882428.13743: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, 
pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8454 1726882428.13747: getting variables 8454 1726882428.13748: in VariableManager get_vars() 8454 1726882428.13795: Calling all_inventory to load vars for managed_node3 8454 1726882428.13799: Calling groups_inventory to load vars for managed_node3 8454 1726882428.13802: Calling all_plugins_inventory to load vars for managed_node3 8454 1726882428.13816: Calling all_plugins_play to load vars for managed_node3 8454 1726882428.13819: Calling groups_plugins_inventory to load vars for managed_node3 8454 1726882428.13823: Calling groups_plugins_play to load vars for managed_node3 8454 1726882428.16303: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8454 1726882428.19068: done with get_vars() 8454 1726882428.19105: done getting variables 8454 1726882428.19173: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 8454 1726882428.19300: variable 'profile' from source: include params 8454 1726882428.19305: variable 'item' from source: include params 8454 1726882428.19378: variable 'item' from source: include params TASK [Assert that the fingerprint comment is present in bond0.0] *************** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:15 Friday 20 September 
2024 21:33:48 -0400 (0:00:00.087) 0:00:26.211 ****** 8454 1726882428.19417: entering _queue_task() for managed_node3/assert 8454 1726882428.19748: worker is 1 (out of 1 available) 8454 1726882428.19761: exiting _queue_task() for managed_node3/assert 8454 1726882428.19777: done queuing things up, now waiting for results queue to drain 8454 1726882428.19778: waiting for pending results... 8454 1726882428.20254: running TaskExecutor() for managed_node3/TASK: Assert that the fingerprint comment is present in bond0.0 8454 1726882428.20259: in run() - task 0affe814-3a2d-f59f-16b9-000000000269 8454 1726882428.20263: variable 'ansible_search_path' from source: unknown 8454 1726882428.20267: variable 'ansible_search_path' from source: unknown 8454 1726882428.20279: calling self._execute() 8454 1726882428.20388: variable 'ansible_host' from source: host vars for 'managed_node3' 8454 1726882428.20403: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8454 1726882428.20418: variable 'omit' from source: magic vars 8454 1726882428.20851: variable 'ansible_distribution_major_version' from source: facts 8454 1726882428.20870: Evaluated conditional (ansible_distribution_major_version != '6'): True 8454 1726882428.20884: variable 'omit' from source: magic vars 8454 1726882428.20943: variable 'omit' from source: magic vars 8454 1726882428.21071: variable 'profile' from source: include params 8454 1726882428.21081: variable 'item' from source: include params 8454 1726882428.21168: variable 'item' from source: include params 8454 1726882428.21194: variable 'omit' from source: magic vars 8454 1726882428.21248: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 8454 1726882428.21299: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 8454 1726882428.21326: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 8454 1726882428.21354: 
Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8454 1726882428.21439: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8454 1726882428.21443: variable 'inventory_hostname' from source: host vars for 'managed_node3' 8454 1726882428.21446: variable 'ansible_host' from source: host vars for 'managed_node3' 8454 1726882428.21448: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8454 1726882428.21559: Set connection var ansible_connection to ssh 8454 1726882428.21578: Set connection var ansible_shell_executable to /bin/sh 8454 1726882428.21594: Set connection var ansible_timeout to 10 8454 1726882428.21602: Set connection var ansible_shell_type to sh 8454 1726882428.21618: Set connection var ansible_pipelining to False 8454 1726882428.21630: Set connection var ansible_module_compression to ZIP_DEFLATED 8454 1726882428.21662: variable 'ansible_shell_executable' from source: unknown 8454 1726882428.21670: variable 'ansible_connection' from source: unknown 8454 1726882428.21678: variable 'ansible_module_compression' from source: unknown 8454 1726882428.21685: variable 'ansible_shell_type' from source: unknown 8454 1726882428.21743: variable 'ansible_shell_executable' from source: unknown 8454 1726882428.21746: variable 'ansible_host' from source: host vars for 'managed_node3' 8454 1726882428.21749: variable 'ansible_pipelining' from source: unknown 8454 1726882428.21752: variable 'ansible_timeout' from source: unknown 8454 1726882428.21754: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8454 1726882428.21895: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 8454 1726882428.21913: variable 'omit' from source: magic vars 8454 1726882428.21924: starting attempt loop 8454 1726882428.21931: running the handler 8454 1726882428.22075: variable 'lsr_net_profile_fingerprint' from source: set_fact 8454 1726882428.22087: Evaluated conditional (lsr_net_profile_fingerprint): True 8454 1726882428.22099: handler run complete 8454 1726882428.22139: attempt loop complete, returning result 8454 1726882428.22142: _execute() done 8454 1726882428.22145: dumping result to json 8454 1726882428.22147: done dumping result, returning 8454 1726882428.22176: done running TaskExecutor() for managed_node3/TASK: Assert that the fingerprint comment is present in bond0.0 [0affe814-3a2d-f59f-16b9-000000000269] 8454 1726882428.22179: sending task result for task 0affe814-3a2d-f59f-16b9-000000000269 ok: [managed_node3] => { "changed": false } MSG: All assertions passed 8454 1726882428.22333: no more pending results, returning what we have 8454 1726882428.22338: results queue empty 8454 1726882428.22340: checking for any_errors_fatal 8454 1726882428.22347: done checking for any_errors_fatal 8454 1726882428.22348: checking for max_fail_percentage 8454 1726882428.22351: done checking for max_fail_percentage 8454 1726882428.22352: checking to see if all hosts have failed and the running result is not ok 8454 1726882428.22353: done checking to see if all hosts have failed 8454 1726882428.22354: getting the remaining hosts for this loop 8454 1726882428.22356: done getting the remaining hosts for this loop 8454 1726882428.22361: getting the next task for host managed_node3 8454 1726882428.22373: done getting next task for host managed_node3 8454 1726882428.22378: ^ task is: TASK: Include the task 'get_profile_stat.yml' 8454 1726882428.22382: ^ state is: HOST STATE: block=2, task=13, 
rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8454 1726882428.22386: getting variables 8454 1726882428.22388: in VariableManager get_vars() 8454 1726882428.22433: Calling all_inventory to load vars for managed_node3 8454 1726882428.22540: Calling groups_inventory to load vars for managed_node3 8454 1726882428.22544: Calling all_plugins_inventory to load vars for managed_node3 8454 1726882428.22551: done sending task result for task 0affe814-3a2d-f59f-16b9-000000000269 8454 1726882428.22554: WORKER PROCESS EXITING 8454 1726882428.22566: Calling all_plugins_play to load vars for managed_node3 8454 1726882428.22570: Calling groups_plugins_inventory to load vars for managed_node3 8454 1726882428.22574: Calling groups_plugins_play to load vars for managed_node3 8454 1726882428.24815: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8454 1726882428.27825: done with get_vars() 8454 1726882428.28023: done getting variables TASK [Include the task 'get_profile_stat.yml'] ********************************* task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:3 Friday 20 September 2024 21:33:48 -0400 (0:00:00.087) 0:00:26.299 ****** 8454 1726882428.28133: entering _queue_task() for managed_node3/include_tasks 8454 1726882428.28887: worker is 1 (out of 1 available) 8454 1726882428.28899: exiting _queue_task() for managed_node3/include_tasks 8454 
1726882428.28911: done queuing things up, now waiting for results queue to drain 8454 1726882428.28913: waiting for pending results... 8454 1726882428.29367: running TaskExecutor() for managed_node3/TASK: Include the task 'get_profile_stat.yml' 8454 1726882428.29518: in run() - task 0affe814-3a2d-f59f-16b9-00000000026d 8454 1726882428.29533: variable 'ansible_search_path' from source: unknown 8454 1726882428.29546: variable 'ansible_search_path' from source: unknown 8454 1726882428.29794: calling self._execute() 8454 1726882428.29798: variable 'ansible_host' from source: host vars for 'managed_node3' 8454 1726882428.29801: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8454 1726882428.29804: variable 'omit' from source: magic vars 8454 1726882428.30156: variable 'ansible_distribution_major_version' from source: facts 8454 1726882428.30169: Evaluated conditional (ansible_distribution_major_version != '6'): True 8454 1726882428.30177: _execute() done 8454 1726882428.30185: dumping result to json 8454 1726882428.30189: done dumping result, returning 8454 1726882428.30196: done running TaskExecutor() for managed_node3/TASK: Include the task 'get_profile_stat.yml' [0affe814-3a2d-f59f-16b9-00000000026d] 8454 1726882428.30204: sending task result for task 0affe814-3a2d-f59f-16b9-00000000026d 8454 1726882428.30306: done sending task result for task 0affe814-3a2d-f59f-16b9-00000000026d 8454 1726882428.30309: WORKER PROCESS EXITING 8454 1726882428.30339: no more pending results, returning what we have 8454 1726882428.30344: in VariableManager get_vars() 8454 1726882428.30396: Calling all_inventory to load vars for managed_node3 8454 1726882428.30399: Calling groups_inventory to load vars for managed_node3 8454 1726882428.30402: Calling all_plugins_inventory to load vars for managed_node3 8454 1726882428.30415: Calling all_plugins_play to load vars for managed_node3 8454 1726882428.30418: Calling groups_plugins_inventory to load vars for 
managed_node3 8454 1726882428.30421: Calling groups_plugins_play to load vars for managed_node3 8454 1726882428.39779: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8454 1726882428.42754: done with get_vars() 8454 1726882428.42794: variable 'ansible_search_path' from source: unknown 8454 1726882428.42796: variable 'ansible_search_path' from source: unknown 8454 1726882428.42847: we have included files to process 8454 1726882428.42849: generating all_blocks data 8454 1726882428.42851: done generating all_blocks data 8454 1726882428.42854: processing included file: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 8454 1726882428.42855: loading included file: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 8454 1726882428.42858: Loading data from /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 8454 1726882428.44000: done processing included file 8454 1726882428.44003: iterating over new_blocks loaded from include file 8454 1726882428.44005: in VariableManager get_vars() 8454 1726882428.44032: done with get_vars() 8454 1726882428.44036: filtering new block on tags 8454 1726882428.44070: done filtering new block on tags 8454 1726882428.44073: in VariableManager get_vars() 8454 1726882428.44097: done with get_vars() 8454 1726882428.44099: filtering new block on tags 8454 1726882428.44128: done filtering new block on tags 8454 1726882428.44131: done iterating over new_blocks loaded from include file included: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml for managed_node3 8454 1726882428.44138: extending task lists for all hosts with included blocks 8454 1726882428.44376: done extending task lists 8454 1726882428.44377: done processing 
included files 8454 1726882428.44378: results queue empty 8454 1726882428.44379: checking for any_errors_fatal 8454 1726882428.44383: done checking for any_errors_fatal 8454 1726882428.44384: checking for max_fail_percentage 8454 1726882428.44385: done checking for max_fail_percentage 8454 1726882428.44386: checking to see if all hosts have failed and the running result is not ok 8454 1726882428.44388: done checking to see if all hosts have failed 8454 1726882428.44389: getting the remaining hosts for this loop 8454 1726882428.44390: done getting the remaining hosts for this loop 8454 1726882428.44393: getting the next task for host managed_node3 8454 1726882428.44398: done getting next task for host managed_node3 8454 1726882428.44400: ^ task is: TASK: Initialize NM profile exist and ansible_managed comment flag 8454 1726882428.44404: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8454 1726882428.44406: getting variables 8454 1726882428.44407: in VariableManager get_vars() 8454 1726882428.44423: Calling all_inventory to load vars for managed_node3 8454 1726882428.44426: Calling groups_inventory to load vars for managed_node3 8454 1726882428.44429: Calling all_plugins_inventory to load vars for managed_node3 8454 1726882428.44439: Calling all_plugins_play to load vars for managed_node3 8454 1726882428.44442: Calling groups_plugins_inventory to load vars for managed_node3 8454 1726882428.44446: Calling groups_plugins_play to load vars for managed_node3 8454 1726882428.46475: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8454 1726882428.50104: done with get_vars() 8454 1726882428.50242: done getting variables 8454 1726882428.50289: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Initialize NM profile exist and ansible_managed comment flag] ************ task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:3 Friday 20 September 2024 21:33:48 -0400 (0:00:00.221) 0:00:26.520 ****** 8454 1726882428.50322: entering _queue_task() for managed_node3/set_fact 8454 1726882428.51095: worker is 1 (out of 1 available) 8454 1726882428.51109: exiting _queue_task() for managed_node3/set_fact 8454 1726882428.51125: done queuing things up, now waiting for results queue to drain 8454 1726882428.51126: waiting for pending results... 
8454 1726882428.51855: running TaskExecutor() for managed_node3/TASK: Initialize NM profile exist and ansible_managed comment flag 8454 1726882428.52281: in run() - task 0affe814-3a2d-f59f-16b9-000000000440 8454 1726882428.52441: variable 'ansible_search_path' from source: unknown 8454 1726882428.52445: variable 'ansible_search_path' from source: unknown 8454 1726882428.52448: calling self._execute() 8454 1726882428.52573: variable 'ansible_host' from source: host vars for 'managed_node3' 8454 1726882428.52583: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8454 1726882428.52597: variable 'omit' from source: magic vars 8454 1726882428.53777: variable 'ansible_distribution_major_version' from source: facts 8454 1726882428.53831: Evaluated conditional (ansible_distribution_major_version != '6'): True 8454 1726882428.53839: variable 'omit' from source: magic vars 8454 1726882428.53902: variable 'omit' from source: magic vars 8454 1726882428.54066: variable 'omit' from source: magic vars 8454 1726882428.54118: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 8454 1726882428.54278: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 8454 1726882428.54307: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 8454 1726882428.54330: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8454 1726882428.54479: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8454 1726882428.54483: variable 'inventory_hostname' from source: host vars for 'managed_node3' 8454 1726882428.54486: variable 'ansible_host' from source: host vars for 'managed_node3' 8454 1726882428.54489: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8454 
1726882428.54806: Set connection var ansible_connection to ssh 8454 1726882428.54810: Set connection var ansible_shell_executable to /bin/sh 8454 1726882428.54940: Set connection var ansible_timeout to 10 8454 1726882428.54943: Set connection var ansible_shell_type to sh 8454 1726882428.54946: Set connection var ansible_pipelining to False 8454 1726882428.54949: Set connection var ansible_module_compression to ZIP_DEFLATED 8454 1726882428.55023: variable 'ansible_shell_executable' from source: unknown 8454 1726882428.55027: variable 'ansible_connection' from source: unknown 8454 1726882428.55031: variable 'ansible_module_compression' from source: unknown 8454 1726882428.55037: variable 'ansible_shell_type' from source: unknown 8454 1726882428.55040: variable 'ansible_shell_executable' from source: unknown 8454 1726882428.55042: variable 'ansible_host' from source: host vars for 'managed_node3' 8454 1726882428.55045: variable 'ansible_pipelining' from source: unknown 8454 1726882428.55047: variable 'ansible_timeout' from source: unknown 8454 1726882428.55049: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8454 1726882428.55559: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 8454 1726882428.55570: variable 'omit' from source: magic vars 8454 1726882428.55627: starting attempt loop 8454 1726882428.55630: running the handler 8454 1726882428.55633: handler run complete 8454 1726882428.55681: attempt loop complete, returning result 8454 1726882428.55684: _execute() done 8454 1726882428.55688: dumping result to json 8454 1726882428.55691: done dumping result, returning 8454 1726882428.55798: done running TaskExecutor() for managed_node3/TASK: Initialize NM profile exist and 
ansible_managed comment flag [0affe814-3a2d-f59f-16b9-000000000440] 8454 1726882428.55806: sending task result for task 0affe814-3a2d-f59f-16b9-000000000440 8454 1726882428.56015: done sending task result for task 0affe814-3a2d-f59f-16b9-000000000440 ok: [managed_node3] => { "ansible_facts": { "lsr_net_profile_ansible_managed": false, "lsr_net_profile_exists": false, "lsr_net_profile_fingerprint": false }, "changed": false } 8454 1726882428.56095: no more pending results, returning what we have 8454 1726882428.56099: results queue empty 8454 1726882428.56100: checking for any_errors_fatal 8454 1726882428.56103: done checking for any_errors_fatal 8454 1726882428.56104: checking for max_fail_percentage 8454 1726882428.56106: done checking for max_fail_percentage 8454 1726882428.56107: checking to see if all hosts have failed and the running result is not ok 8454 1726882428.56108: done checking to see if all hosts have failed 8454 1726882428.56108: getting the remaining hosts for this loop 8454 1726882428.56110: done getting the remaining hosts for this loop 8454 1726882428.56114: getting the next task for host managed_node3 8454 1726882428.56122: done getting next task for host managed_node3 8454 1726882428.56127: ^ task is: TASK: Stat profile file 8454 1726882428.56132: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8454 1726882428.56439: getting variables 8454 1726882428.56442: in VariableManager get_vars() 8454 1726882428.56499: Calling all_inventory to load vars for managed_node3 8454 1726882428.56503: Calling groups_inventory to load vars for managed_node3 8454 1726882428.56506: Calling all_plugins_inventory to load vars for managed_node3 8454 1726882428.56520: Calling all_plugins_play to load vars for managed_node3 8454 1726882428.56524: Calling groups_plugins_inventory to load vars for managed_node3 8454 1726882428.56528: Calling groups_plugins_play to load vars for managed_node3 8454 1726882428.56645: WORKER PROCESS EXITING 8454 1726882428.59556: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8454 1726882428.62358: done with get_vars() 8454 1726882428.62383: done getting variables TASK [Stat profile file] ******************************************************* task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:9 Friday 20 September 2024 21:33:48 -0400 (0:00:00.121) 0:00:26.642 ****** 8454 1726882428.62463: entering _queue_task() for managed_node3/stat 8454 1726882428.62718: worker is 1 (out of 1 available) 8454 1726882428.62731: exiting _queue_task() for managed_node3/stat 8454 1726882428.62747: done queuing things up, now waiting for results queue to drain 8454 1726882428.62749: waiting for pending results... 
8454 1726882428.62940: running TaskExecutor() for managed_node3/TASK: Stat profile file 8454 1726882428.63041: in run() - task 0affe814-3a2d-f59f-16b9-000000000441 8454 1726882428.63055: variable 'ansible_search_path' from source: unknown 8454 1726882428.63059: variable 'ansible_search_path' from source: unknown 8454 1726882428.63098: calling self._execute() 8454 1726882428.63182: variable 'ansible_host' from source: host vars for 'managed_node3' 8454 1726882428.63186: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8454 1726882428.63199: variable 'omit' from source: magic vars 8454 1726882428.63523: variable 'ansible_distribution_major_version' from source: facts 8454 1726882428.63540: Evaluated conditional (ansible_distribution_major_version != '6'): True 8454 1726882428.63546: variable 'omit' from source: magic vars 8454 1726882428.63589: variable 'omit' from source: magic vars 8454 1726882428.63723: variable 'profile' from source: include params 8454 1726882428.63726: variable 'item' from source: include params 8454 1726882428.63848: variable 'item' from source: include params 8454 1726882428.63933: variable 'omit' from source: magic vars 8454 1726882428.63956: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 8454 1726882428.64022: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 8454 1726882428.64060: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 8454 1726882428.64132: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8454 1726882428.64189: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8454 1726882428.64341: variable 'inventory_hostname' from source: host vars for 'managed_node3' 8454 1726882428.64345: variable 
'ansible_host' from source: host vars for 'managed_node3' 8454 1726882428.64347: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8454 1726882428.64675: Set connection var ansible_connection to ssh 8454 1726882428.64818: Set connection var ansible_shell_executable to /bin/sh 8454 1726882428.64821: Set connection var ansible_timeout to 10 8454 1726882428.64824: Set connection var ansible_shell_type to sh 8454 1726882428.64827: Set connection var ansible_pipelining to False 8454 1726882428.64830: Set connection var ansible_module_compression to ZIP_DEFLATED 8454 1726882428.64896: variable 'ansible_shell_executable' from source: unknown 8454 1726882428.64989: variable 'ansible_connection' from source: unknown 8454 1726882428.64992: variable 'ansible_module_compression' from source: unknown 8454 1726882428.64994: variable 'ansible_shell_type' from source: unknown 8454 1726882428.65000: variable 'ansible_shell_executable' from source: unknown 8454 1726882428.65008: variable 'ansible_host' from source: host vars for 'managed_node3' 8454 1726882428.65031: variable 'ansible_pipelining' from source: unknown 8454 1726882428.65080: variable 'ansible_timeout' from source: unknown 8454 1726882428.65084: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8454 1726882428.65515: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 8454 1726882428.65521: variable 'omit' from source: magic vars 8454 1726882428.65524: starting attempt loop 8454 1726882428.65532: running the handler 8454 1726882428.65535: _low_level_execute_command(): starting 8454 1726882428.65625: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 8454 1726882428.66293: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 
2023 <<< 8454 1726882428.66305: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 8454 1726882428.66314: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found <<< 8454 1726882428.66322: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8454 1726882428.66349: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 8454 1726882428.66353: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8454 1726882428.66411: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 8454 1726882428.66415: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8454 1726882428.66541: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8454 1726882428.68641: stdout chunk (state=3): >>>/root <<< 8454 1726882428.68645: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8454 1726882428.68648: stderr chunk (state=3): >>><<< 8454 1726882428.68654: stdout chunk (state=3): >>><<< 8454 1726882428.68865: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 8454 1726882428.68882: _low_level_execute_command(): starting 8454 1726882428.68899: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882428.6886568-9405-208827572673987 `" && echo ansible-tmp-1726882428.6886568-9405-208827572673987="` echo /root/.ansible/tmp/ansible-tmp-1726882428.6886568-9405-208827572673987 `" ) && sleep 0' 8454 1726882428.69702: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 8454 1726882428.69709: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 8454 1726882428.69725: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found <<< 8454 1726882428.69740: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 8454 1726882428.69757: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 8454 1726882428.69767: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found <<< 8454 1726882428.69786: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8454 1726882428.69900: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 8454 1726882428.70021: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8454 1726882428.72108: stdout chunk (state=3): >>>ansible-tmp-1726882428.6886568-9405-208827572673987=/root/.ansible/tmp/ansible-tmp-1726882428.6886568-9405-208827572673987 <<< 8454 1726882428.72249: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8454 1726882428.72301: stderr chunk (state=3): >>><<< 8454 1726882428.72304: stdout chunk (state=3): >>><<< 8454 1726882428.72382: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882428.6886568-9405-208827572673987=/root/.ansible/tmp/ansible-tmp-1726882428.6886568-9405-208827572673987 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config 
debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 8454 1726882428.72386: variable 'ansible_module_compression' from source: unknown 8454 1726882428.72447: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-8454ynh6eu7y/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 8454 1726882428.72642: variable 'ansible_facts' from source: unknown 8454 1726882428.72645: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882428.6886568-9405-208827572673987/AnsiballZ_stat.py 8454 1726882428.72765: Sending initial data 8454 1726882428.72770: Sent initial data (151 bytes) 8454 1726882428.73321: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 8454 1726882428.73327: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 8454 1726882428.73364: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 8454 1726882428.73368: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8454 1726882428.73370: stderr chunk (state=3): >>>debug1: configuration requests final Match pass 
debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 8454 1726882428.73373: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8454 1726882428.73440: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 8454 1726882428.73444: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8454 1726882428.73559: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8454 1726882428.75268: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 8454 1726882428.75272: stderr chunk (state=3): >>>debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 8454 1726882428.75377: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 8454 1726882428.75493: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-8454ynh6eu7y/tmp2nh4yhgq /root/.ansible/tmp/ansible-tmp-1726882428.6886568-9405-208827572673987/AnsiballZ_stat.py <<< 8454 1726882428.75497: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882428.6886568-9405-208827572673987/AnsiballZ_stat.py" <<< 8454 1726882428.75605: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-8454ynh6eu7y/tmp2nh4yhgq" to remote "/root/.ansible/tmp/ansible-tmp-1726882428.6886568-9405-208827572673987/AnsiballZ_stat.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882428.6886568-9405-208827572673987/AnsiballZ_stat.py" <<< 8454 1726882428.76689: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8454 1726882428.76753: stderr chunk (state=3): >>><<< 8454 1726882428.76756: stdout chunk (state=3): >>><<< 8454 1726882428.76779: done transferring module to remote 8454 1726882428.76790: _low_level_execute_command(): starting 8454 1726882428.76793: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882428.6886568-9405-208827572673987/ /root/.ansible/tmp/ansible-tmp-1726882428.6886568-9405-208827572673987/AnsiballZ_stat.py && sleep 0' 8454 1726882428.77224: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 8454 1726882428.77267: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 8454 1726882428.77270: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found <<< 8454 1726882428.77273: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 8454 1726882428.77275: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 <<< 8454 1726882428.77283: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8454 1726882428.77332: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 8454 1726882428.77339: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 8454 1726882428.77451: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8454 1726882428.79385: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8454 1726882428.79432: stderr chunk (state=3): >>><<< 8454 1726882428.79440: stdout chunk (state=3): >>><<< 8454 1726882428.79453: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 8454 1726882428.79456: _low_level_execute_command(): starting 8454 1726882428.79462: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882428.6886568-9405-208827572673987/AnsiballZ_stat.py && sleep 0' 8454 1726882428.79916: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 8454 1726882428.79919: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 <<< 8454 1726882428.79922: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8454 1726882428.79924: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 8454 1726882428.79927: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8454 1726882428.79977: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 8454 
1726882428.79981: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 8454 1726882428.80108: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8454 1726882428.97604: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-bond0.1", "follow": false, "checksum_algorithm": "sha1"}}} <<< 8454 1726882428.99545: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.41.238 closed. <<< 8454 1726882428.99549: stdout chunk (state=3): >>><<< 8454 1726882428.99552: stderr chunk (state=3): >>><<< 8454 1726882428.99555: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-bond0.1", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.41.238 closed. 8454 1726882428.99558: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/etc/sysconfig/network-scripts/ifcfg-bond0.1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882428.6886568-9405-208827572673987/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 8454 1726882428.99561: _low_level_execute_command(): starting 8454 1726882428.99564: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882428.6886568-9405-208827572673987/ > /dev/null 2>&1 && sleep 0' 8454 1726882429.01041: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 8454 1726882429.01253: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: 
hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8454 1726882429.01293: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 8454 1726882429.01397: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 8454 1726882429.01417: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8454 1726882429.01561: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8454 1726882429.03718: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8454 1726882429.03729: stdout chunk (state=3): >>><<< 8454 1726882429.03743: stderr chunk (state=3): >>><<< 8454 1726882429.03763: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 8454 1726882429.03775: handler run complete 8454 1726882429.03849: attempt loop complete, returning result 8454 1726882429.04141: _execute() done 8454 1726882429.04145: dumping result to json 8454 1726882429.04147: done dumping result, returning 8454 1726882429.04149: done running TaskExecutor() for managed_node3/TASK: Stat profile file [0affe814-3a2d-f59f-16b9-000000000441] 8454 1726882429.04152: sending task result for task 0affe814-3a2d-f59f-16b9-000000000441 8454 1726882429.04230: done sending task result for task 0affe814-3a2d-f59f-16b9-000000000441 8454 1726882429.04233: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "stat": { "exists": false } } 8454 1726882429.04319: no more pending results, returning what we have 8454 1726882429.04323: results queue empty 8454 1726882429.04324: checking for any_errors_fatal 8454 1726882429.04331: done checking for any_errors_fatal 8454 1726882429.04332: checking for max_fail_percentage 8454 1726882429.04336: done checking for max_fail_percentage 8454 1726882429.04337: checking to see if all hosts have failed and the running result is not ok 8454 1726882429.04338: done checking to see if all hosts have failed 8454 1726882429.04339: getting the remaining hosts for this loop 8454 1726882429.04342: done getting the remaining hosts for this loop 8454 1726882429.04347: getting the next task for host managed_node3 8454 1726882429.04355: done getting next task for host managed_node3 8454 1726882429.04358: ^ task is: TASK: Set NM profile exist flag based on the profile files 8454 1726882429.04363: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, 
pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8454 1726882429.04367: getting variables 8454 1726882429.04369: in VariableManager get_vars() 8454 1726882429.04417: Calling all_inventory to load vars for managed_node3 8454 1726882429.04421: Calling groups_inventory to load vars for managed_node3 8454 1726882429.04424: Calling all_plugins_inventory to load vars for managed_node3 8454 1726882429.04642: Calling all_plugins_play to load vars for managed_node3 8454 1726882429.04647: Calling groups_plugins_inventory to load vars for managed_node3 8454 1726882429.04654: Calling groups_plugins_play to load vars for managed_node3 8454 1726882429.09403: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8454 1726882429.12596: done with get_vars() 8454 1726882429.12650: done getting variables 8454 1726882429.12775: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag based on the profile files] ******************** task path: 
/tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:17 Friday 20 September 2024 21:33:49 -0400 (0:00:00.503) 0:00:27.145 ****** 8454 1726882429.12813: entering _queue_task() for managed_node3/set_fact 8454 1726882429.13575: worker is 1 (out of 1 available) 8454 1726882429.13704: exiting _queue_task() for managed_node3/set_fact 8454 1726882429.13722: done queuing things up, now waiting for results queue to drain 8454 1726882429.13724: waiting for pending results... 8454 1726882429.13962: running TaskExecutor() for managed_node3/TASK: Set NM profile exist flag based on the profile files 8454 1726882429.14159: in run() - task 0affe814-3a2d-f59f-16b9-000000000442 8454 1726882429.14165: variable 'ansible_search_path' from source: unknown 8454 1726882429.14169: variable 'ansible_search_path' from source: unknown 8454 1726882429.14213: calling self._execute() 8454 1726882429.14308: variable 'ansible_host' from source: host vars for 'managed_node3' 8454 1726882429.14315: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8454 1726882429.14325: variable 'omit' from source: magic vars 8454 1726882429.14670: variable 'ansible_distribution_major_version' from source: facts 8454 1726882429.14685: Evaluated conditional (ansible_distribution_major_version != '6'): True 8454 1726882429.14792: variable 'profile_stat' from source: set_fact 8454 1726882429.14808: Evaluated conditional (profile_stat.stat.exists): False 8454 1726882429.14811: when evaluation is False, skipping this task 8454 1726882429.14815: _execute() done 8454 1726882429.14817: dumping result to json 8454 1726882429.14820: done dumping result, returning 8454 1726882429.14828: done running TaskExecutor() for managed_node3/TASK: Set NM profile exist flag based on the profile files [0affe814-3a2d-f59f-16b9-000000000442] 8454 1726882429.14835: sending task result for task 0affe814-3a2d-f59f-16b9-000000000442 8454 
1726882429.14931: done sending task result for task 0affe814-3a2d-f59f-16b9-000000000442 8454 1726882429.14936: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 8454 1726882429.14991: no more pending results, returning what we have 8454 1726882429.14996: results queue empty 8454 1726882429.14997: checking for any_errors_fatal 8454 1726882429.15009: done checking for any_errors_fatal 8454 1726882429.15010: checking for max_fail_percentage 8454 1726882429.15012: done checking for max_fail_percentage 8454 1726882429.15014: checking to see if all hosts have failed and the running result is not ok 8454 1726882429.15015: done checking to see if all hosts have failed 8454 1726882429.15017: getting the remaining hosts for this loop 8454 1726882429.15019: done getting the remaining hosts for this loop 8454 1726882429.15023: getting the next task for host managed_node3 8454 1726882429.15032: done getting next task for host managed_node3 8454 1726882429.15037: ^ task is: TASK: Get NM profile info 8454 1726882429.15042: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8454 1726882429.15048: getting variables 8454 1726882429.15050: in VariableManager get_vars() 8454 1726882429.15092: Calling all_inventory to load vars for managed_node3 8454 1726882429.15095: Calling groups_inventory to load vars for managed_node3 8454 1726882429.15098: Calling all_plugins_inventory to load vars for managed_node3 8454 1726882429.15110: Calling all_plugins_play to load vars for managed_node3 8454 1726882429.15113: Calling groups_plugins_inventory to load vars for managed_node3 8454 1726882429.15116: Calling groups_plugins_play to load vars for managed_node3 8454 1726882429.17748: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8454 1726882429.19655: done with get_vars() 8454 1726882429.19694: done getting variables 8454 1726882429.19767: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Get NM profile info] ***************************************************** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:25 Friday 20 September 2024 21:33:49 -0400 (0:00:00.069) 0:00:27.215 ****** 8454 1726882429.19809: entering _queue_task() for managed_node3/shell 8454 1726882429.20158: worker is 1 (out of 1 available) 8454 1726882429.20173: exiting _queue_task() for managed_node3/shell 8454 1726882429.20187: done queuing things up, now waiting for results queue to drain 8454 1726882429.20189: waiting for pending results... 
8454 1726882429.20571: running TaskExecutor() for managed_node3/TASK: Get NM profile info 8454 1726882429.20840: in run() - task 0affe814-3a2d-f59f-16b9-000000000443 8454 1726882429.20844: variable 'ansible_search_path' from source: unknown 8454 1726882429.20847: variable 'ansible_search_path' from source: unknown 8454 1726882429.20850: calling self._execute() 8454 1726882429.20857: variable 'ansible_host' from source: host vars for 'managed_node3' 8454 1726882429.20875: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8454 1726882429.20898: variable 'omit' from source: magic vars 8454 1726882429.21941: variable 'ansible_distribution_major_version' from source: facts 8454 1726882429.21945: Evaluated conditional (ansible_distribution_major_version != '6'): True 8454 1726882429.21947: variable 'omit' from source: magic vars 8454 1726882429.22085: variable 'omit' from source: magic vars 8454 1726882429.22245: variable 'profile' from source: include params 8454 1726882429.22250: variable 'item' from source: include params 8454 1726882429.22323: variable 'item' from source: include params 8454 1726882429.22343: variable 'omit' from source: magic vars 8454 1726882429.22385: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 8454 1726882429.22418: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 8454 1726882429.22438: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 8454 1726882429.22464: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8454 1726882429.22476: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8454 1726882429.22505: variable 'inventory_hostname' from source: host vars for 'managed_node3' 8454 1726882429.22510: variable 
'ansible_host' from source: host vars for 'managed_node3' 8454 1726882429.22513: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8454 1726882429.22599: Set connection var ansible_connection to ssh 8454 1726882429.22609: Set connection var ansible_shell_executable to /bin/sh 8454 1726882429.22615: Set connection var ansible_timeout to 10 8454 1726882429.22618: Set connection var ansible_shell_type to sh 8454 1726882429.22628: Set connection var ansible_pipelining to False 8454 1726882429.22636: Set connection var ansible_module_compression to ZIP_DEFLATED 8454 1726882429.22656: variable 'ansible_shell_executable' from source: unknown 8454 1726882429.22661: variable 'ansible_connection' from source: unknown 8454 1726882429.22664: variable 'ansible_module_compression' from source: unknown 8454 1726882429.22666: variable 'ansible_shell_type' from source: unknown 8454 1726882429.22669: variable 'ansible_shell_executable' from source: unknown 8454 1726882429.22674: variable 'ansible_host' from source: host vars for 'managed_node3' 8454 1726882429.22682: variable 'ansible_pipelining' from source: unknown 8454 1726882429.22685: variable 'ansible_timeout' from source: unknown 8454 1726882429.22687: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8454 1726882429.22807: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 8454 1726882429.22818: variable 'omit' from source: magic vars 8454 1726882429.22823: starting attempt loop 8454 1726882429.22827: running the handler 8454 1726882429.22842: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 8454 1726882429.22860: _low_level_execute_command(): starting 8454 1726882429.22867: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 8454 1726882429.23386: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 8454 1726882429.23391: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8454 1726882429.23395: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found <<< 8454 1726882429.23398: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8454 1726882429.23446: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 8454 1726882429.23452: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8454 1726882429.23572: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8454 1726882429.25418: stdout chunk (state=3): >>>/root <<< 8454 1726882429.25642: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8454 1726882429.25645: stdout 
chunk (state=3): >>><<< 8454 1726882429.25648: stderr chunk (state=3): >>><<< 8454 1726882429.25652: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 8454 1726882429.25656: _low_level_execute_command(): starting 8454 1726882429.25662: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882429.2564178-9433-109486447994713 `" && echo ansible-tmp-1726882429.2564178-9433-109486447994713="` echo /root/.ansible/tmp/ansible-tmp-1726882429.2564178-9433-109486447994713 `" ) && sleep 0' 8454 1726882429.26240: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 8454 1726882429.26250: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 8454 1726882429.26269: stderr chunk (state=3): >>>debug1: 
Reading configuration data /etc/ssh/ssh_config <<< 8454 1726882429.26277: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 8454 1726882429.26309: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8454 1726882429.26312: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 8454 1726882429.26315: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8454 1726882429.26371: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 8454 1726882429.26374: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8454 1726882429.26496: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8454 1726882429.28638: stdout chunk (state=3): >>>ansible-tmp-1726882429.2564178-9433-109486447994713=/root/.ansible/tmp/ansible-tmp-1726882429.2564178-9433-109486447994713 <<< 8454 1726882429.28764: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8454 1726882429.28815: stderr chunk (state=3): >>><<< 8454 1726882429.28818: stdout chunk (state=3): >>><<< 8454 1726882429.28832: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882429.2564178-9433-109486447994713=/root/.ansible/tmp/ansible-tmp-1726882429.2564178-9433-109486447994713 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 
2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 8454 1726882429.28861: variable 'ansible_module_compression' from source: unknown 8454 1726882429.28904: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-8454ynh6eu7y/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 8454 1726882429.28943: variable 'ansible_facts' from source: unknown 8454 1726882429.29000: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882429.2564178-9433-109486447994713/AnsiballZ_command.py 8454 1726882429.29107: Sending initial data 8454 1726882429.29110: Sent initial data (154 bytes) 8454 1726882429.29521: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 8454 1726882429.29554: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 8454 
1726882429.29559: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8454 1726882429.29561: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 8454 1726882429.29564: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8454 1726882429.29619: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 8454 1726882429.29626: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8454 1726882429.29740: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8454 1726882429.31438: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 <<< 8454 1726882429.31443: stderr chunk (state=3): >>>debug2: Unrecognised server extension "home-directory" debug2: Server 
supports extension "users-groups-by-id@openssh.com" revision 1 <<< 8454 1726882429.31548: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 8454 1726882429.31663: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-8454ynh6eu7y/tmpr1d3sile /root/.ansible/tmp/ansible-tmp-1726882429.2564178-9433-109486447994713/AnsiballZ_command.py <<< 8454 1726882429.31668: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882429.2564178-9433-109486447994713/AnsiballZ_command.py" <<< 8454 1726882429.31773: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-8454ynh6eu7y/tmpr1d3sile" to remote "/root/.ansible/tmp/ansible-tmp-1726882429.2564178-9433-109486447994713/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882429.2564178-9433-109486447994713/AnsiballZ_command.py" <<< 8454 1726882429.32839: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8454 1726882429.32895: stderr chunk (state=3): >>><<< 8454 1726882429.32898: stdout chunk (state=3): >>><<< 8454 1726882429.32918: done transferring module to remote 8454 1726882429.32928: _low_level_execute_command(): starting 8454 1726882429.32931: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882429.2564178-9433-109486447994713/ /root/.ansible/tmp/ansible-tmp-1726882429.2564178-9433-109486447994713/AnsiballZ_command.py && sleep 0' 8454 1726882429.33360: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 8454 1726882429.33363: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8454 1726882429.33370: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 8454 1726882429.33373: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8454 1726882429.33422: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 8454 1726882429.33426: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8454 1726882429.33543: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8454 1726882429.35463: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8454 1726882429.35504: stderr chunk (state=3): >>><<< 8454 1726882429.35508: stdout chunk (state=3): >>><<< 8454 1726882429.35522: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 8454 1726882429.35529: _low_level_execute_command(): starting 8454 1726882429.35532: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882429.2564178-9433-109486447994713/AnsiballZ_command.py && sleep 0' 8454 1726882429.35930: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 8454 1726882429.35970: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 8454 1726882429.35974: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8454 1726882429.35976: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 8454 1726882429.35981: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8454 1726882429.36026: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master 
debug2: fd 3 setting O_NONBLOCK <<< 8454 1726882429.36033: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8454 1726882429.36154: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8454 1726882429.56242: stdout chunk (state=3): >>> {"changed": true, "stdout": "bond0.1 /etc/NetworkManager/system-connections/bond0.1.nmconnection ", "stderr": "", "rc": 0, "cmd": "nmcli -f NAME,FILENAME connection show |grep bond0.1 | grep /etc", "start": "2024-09-20 21:33:49.533783", "end": "2024-09-20 21:33:49.557326", "delta": "0:00:00.023543", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep bond0.1 | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 8454 1726882429.57737: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.41.238 closed. 
<<< 8454 1726882429.57741: stdout chunk (state=3): >>><<< 8454 1726882429.57744: stderr chunk (state=3): >>><<< 8454 1726882429.57763: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "bond0.1 /etc/NetworkManager/system-connections/bond0.1.nmconnection ", "stderr": "", "rc": 0, "cmd": "nmcli -f NAME,FILENAME connection show |grep bond0.1 | grep /etc", "start": "2024-09-20 21:33:49.533783", "end": "2024-09-20 21:33:49.557326", "delta": "0:00:00.023543", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep bond0.1 | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.41.238 
closed. 8454 1726882429.57822: done with _execute_module (ansible.legacy.command, {'_raw_params': 'nmcli -f NAME,FILENAME connection show |grep bond0.1 | grep /etc', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882429.2564178-9433-109486447994713/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 8454 1726882429.58140: _low_level_execute_command(): starting 8454 1726882429.58144: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882429.2564178-9433-109486447994713/ > /dev/null 2>&1 && sleep 0' 8454 1726882429.59099: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 8454 1726882429.59470: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master <<< 8454 1726882429.59488: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 8454 1726882429.59509: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8454 1726882429.59705: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8454 1726882429.61765: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8454 1726882429.61769: stdout chunk (state=3): >>><<< 8454 1726882429.61772: stderr chunk (state=3): >>><<< 8454 1726882429.61793: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 8454 1726882429.61808: handler run complete 8454 1726882429.61847: Evaluated conditional 
(False): False 8454 1726882429.61869: attempt loop complete, returning result 8454 1726882429.62140: _execute() done 8454 1726882429.62143: dumping result to json 8454 1726882429.62146: done dumping result, returning 8454 1726882429.62148: done running TaskExecutor() for managed_node3/TASK: Get NM profile info [0affe814-3a2d-f59f-16b9-000000000443] 8454 1726882429.62150: sending task result for task 0affe814-3a2d-f59f-16b9-000000000443 8454 1726882429.62236: done sending task result for task 0affe814-3a2d-f59f-16b9-000000000443 8454 1726882429.62240: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "cmd": "nmcli -f NAME,FILENAME connection show |grep bond0.1 | grep /etc", "delta": "0:00:00.023543", "end": "2024-09-20 21:33:49.557326", "rc": 0, "start": "2024-09-20 21:33:49.533783" } STDOUT: bond0.1 /etc/NetworkManager/system-connections/bond0.1.nmconnection 8454 1726882429.62321: no more pending results, returning what we have 8454 1726882429.62325: results queue empty 8454 1726882429.62326: checking for any_errors_fatal 8454 1726882429.62333: done checking for any_errors_fatal 8454 1726882429.62337: checking for max_fail_percentage 8454 1726882429.62339: done checking for max_fail_percentage 8454 1726882429.62341: checking to see if all hosts have failed and the running result is not ok 8454 1726882429.62341: done checking to see if all hosts have failed 8454 1726882429.62342: getting the remaining hosts for this loop 8454 1726882429.62344: done getting the remaining hosts for this loop 8454 1726882429.62348: getting the next task for host managed_node3 8454 1726882429.62356: done getting next task for host managed_node3 8454 1726882429.62359: ^ task is: TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output 8454 1726882429.62364: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child 
state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8454 1726882429.62369: getting variables 8454 1726882429.62370: in VariableManager get_vars() 8454 1726882429.62418: Calling all_inventory to load vars for managed_node3 8454 1726882429.62422: Calling groups_inventory to load vars for managed_node3 8454 1726882429.62424: Calling all_plugins_inventory to load vars for managed_node3 8454 1726882429.62591: Calling all_plugins_play to load vars for managed_node3 8454 1726882429.62597: Calling groups_plugins_inventory to load vars for managed_node3 8454 1726882429.62602: Calling groups_plugins_play to load vars for managed_node3 8454 1726882429.68342: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8454 1726882429.74547: done with get_vars() 8454 1726882429.74594: done getting variables 8454 1726882429.74876: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag and ansible_managed flag true based on the nmcli output] *** task path: 
/tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:35 Friday 20 September 2024 21:33:49 -0400 (0:00:00.551) 0:00:27.766 ****** 8454 1726882429.74918: entering _queue_task() for managed_node3/set_fact 8454 1726882429.75672: worker is 1 (out of 1 available) 8454 1726882429.75683: exiting _queue_task() for managed_node3/set_fact 8454 1726882429.75697: done queuing things up, now waiting for results queue to drain 8454 1726882429.75698: waiting for pending results... 8454 1726882429.76157: running TaskExecutor() for managed_node3/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output 8454 1726882429.76540: in run() - task 0affe814-3a2d-f59f-16b9-000000000444 8454 1726882429.76544: variable 'ansible_search_path' from source: unknown 8454 1726882429.76547: variable 'ansible_search_path' from source: unknown 8454 1726882429.76550: calling self._execute() 8454 1726882429.76552: variable 'ansible_host' from source: host vars for 'managed_node3' 8454 1726882429.76940: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8454 1726882429.76944: variable 'omit' from source: magic vars 8454 1726882429.77581: variable 'ansible_distribution_major_version' from source: facts 8454 1726882429.77603: Evaluated conditional (ansible_distribution_major_version != '6'): True 8454 1726882429.77985: variable 'nm_profile_exists' from source: set_fact 8454 1726882429.78015: Evaluated conditional (nm_profile_exists.rc == 0): True 8454 1726882429.78028: variable 'omit' from source: magic vars 8454 1726882429.78102: variable 'omit' from source: magic vars 8454 1726882429.78543: variable 'omit' from source: magic vars 8454 1726882429.78547: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 8454 1726882429.78550: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, 
class_only=False) 8454 1726882429.78552: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 8454 1726882429.78555: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8454 1726882429.78557: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8454 1726882429.78755: variable 'inventory_hostname' from source: host vars for 'managed_node3' 8454 1726882429.78766: variable 'ansible_host' from source: host vars for 'managed_node3' 8454 1726882429.78774: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8454 1726882429.78905: Set connection var ansible_connection to ssh 8454 1726882429.78924: Set connection var ansible_shell_executable to /bin/sh 8454 1726882429.79339: Set connection var ansible_timeout to 10 8454 1726882429.79343: Set connection var ansible_shell_type to sh 8454 1726882429.79345: Set connection var ansible_pipelining to False 8454 1726882429.79347: Set connection var ansible_module_compression to ZIP_DEFLATED 8454 1726882429.79350: variable 'ansible_shell_executable' from source: unknown 8454 1726882429.79352: variable 'ansible_connection' from source: unknown 8454 1726882429.79354: variable 'ansible_module_compression' from source: unknown 8454 1726882429.79356: variable 'ansible_shell_type' from source: unknown 8454 1726882429.79358: variable 'ansible_shell_executable' from source: unknown 8454 1726882429.79360: variable 'ansible_host' from source: host vars for 'managed_node3' 8454 1726882429.79362: variable 'ansible_pipelining' from source: unknown 8454 1726882429.79364: variable 'ansible_timeout' from source: unknown 8454 1726882429.79366: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8454 1726882429.79597: Loading ActionModule 'set_fact' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 8454 1726882429.79615: variable 'omit' from source: magic vars 8454 1726882429.79625: starting attempt loop 8454 1726882429.79633: running the handler 8454 1726882429.79653: handler run complete 8454 1726882429.79671: attempt loop complete, returning result 8454 1726882429.79680: _execute() done 8454 1726882429.79689: dumping result to json 8454 1726882429.79697: done dumping result, returning 8454 1726882429.79710: done running TaskExecutor() for managed_node3/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output [0affe814-3a2d-f59f-16b9-000000000444] 8454 1726882429.79720: sending task result for task 0affe814-3a2d-f59f-16b9-000000000444 ok: [managed_node3] => { "ansible_facts": { "lsr_net_profile_ansible_managed": true, "lsr_net_profile_exists": true, "lsr_net_profile_fingerprint": true }, "changed": false } 8454 1726882429.79911: no more pending results, returning what we have 8454 1726882429.79915: results queue empty 8454 1726882429.79916: checking for any_errors_fatal 8454 1726882429.79927: done checking for any_errors_fatal 8454 1726882429.79928: checking for max_fail_percentage 8454 1726882429.79930: done checking for max_fail_percentage 8454 1726882429.79932: checking to see if all hosts have failed and the running result is not ok 8454 1726882429.79933: done checking to see if all hosts have failed 8454 1726882429.79936: getting the remaining hosts for this loop 8454 1726882429.79938: done getting the remaining hosts for this loop 8454 1726882429.79942: getting the next task for host managed_node3 8454 1726882429.79957: done getting next task for host managed_node3 8454 1726882429.79960: ^ task is: TASK: Get the ansible_managed comment in ifcfg-{{ profile 
}} 8454 1726882429.79966: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8454 1726882429.79970: getting variables 8454 1726882429.79972: in VariableManager get_vars() 8454 1726882429.80020: Calling all_inventory to load vars for managed_node3 8454 1726882429.80024: Calling groups_inventory to load vars for managed_node3 8454 1726882429.80027: Calling all_plugins_inventory to load vars for managed_node3 8454 1726882429.80342: Calling all_plugins_play to load vars for managed_node3 8454 1726882429.80347: Calling groups_plugins_inventory to load vars for managed_node3 8454 1726882429.80353: Calling groups_plugins_play to load vars for managed_node3 8454 1726882429.80876: done sending task result for task 0affe814-3a2d-f59f-16b9-000000000444 8454 1726882429.80883: WORKER PROCESS EXITING 8454 1726882429.84825: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8454 1726882429.88750: done with get_vars() 8454 1726882429.88792: done getting variables 8454 1726882429.89071: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 8454 1726882429.89206: variable 'profile' from source: include params 8454 1726882429.89211: variable 'item' from source: include params 8454 1726882429.89489: variable 'item' from source: include params TASK [Get the ansible_managed comment in ifcfg-bond0.1] ************************ task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:49 Friday 20 September 2024 21:33:49 -0400 (0:00:00.146) 0:00:27.913 ****** 8454 1726882429.89532: entering _queue_task() for managed_node3/command 8454 1726882429.90280: worker is 1 (out of 1 available) 8454 1726882429.90293: exiting _queue_task() for managed_node3/command 8454 1726882429.90307: done queuing things up, now waiting for results queue to drain 8454 1726882429.90309: waiting for pending results... 
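[Editor's note — illustrative sketch, not part of the log.] The task being queued at this point lives at `tests/network/playbooks/tasks/get_profile_stat.yml:49` and its source is not shown in this log. Based on the conditionals the trace reports evaluating (`ansible_distribution_major_version != '6'` → True, `profile_stat.stat.exists` → False, leading to a skip), it plausibly has a shape like the following; the file path, grep pattern, and register name are assumptions:

```yaml
# Hypothetical reconstruction of the traced task -- names and the exact
# command are assumed; only the "when:" condition is confirmed by the
# log ("false_condition": "profile_stat.stat.exists").
- name: Get the ansible_managed comment in ifcfg-{{ profile }}
  command: grep "ansible managed" /etc/sysconfig/network-scripts/ifcfg-{{ profile }}
  register: ansible_managed_output  # register name assumed
  when: profile_stat.stat.exists
```

Because `profile_stat.stat.exists` is False here (no ifcfg file for `bond0.1` under NetworkManager keyfile storage), the executor short-circuits in `_execute()` and returns a skip result without ever dispatching the command, which is exactly the `skipping: [managed_node3]` entry that follows.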
8454 1726882429.90652: running TaskExecutor() for managed_node3/TASK: Get the ansible_managed comment in ifcfg-bond0.1 8454 1726882429.91018: in run() - task 0affe814-3a2d-f59f-16b9-000000000446 8454 1726882429.91042: variable 'ansible_search_path' from source: unknown 8454 1726882429.91050: variable 'ansible_search_path' from source: unknown 8454 1726882429.91099: calling self._execute() 8454 1726882429.91443: variable 'ansible_host' from source: host vars for 'managed_node3' 8454 1726882429.91457: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8454 1726882429.91474: variable 'omit' from source: magic vars 8454 1726882429.92302: variable 'ansible_distribution_major_version' from source: facts 8454 1726882429.92322: Evaluated conditional (ansible_distribution_major_version != '6'): True 8454 1726882429.92686: variable 'profile_stat' from source: set_fact 8454 1726882429.92707: Evaluated conditional (profile_stat.stat.exists): False 8454 1726882429.92716: when evaluation is False, skipping this task 8454 1726882429.92723: _execute() done 8454 1726882429.92731: dumping result to json 8454 1726882429.92741: done dumping result, returning 8454 1726882429.92752: done running TaskExecutor() for managed_node3/TASK: Get the ansible_managed comment in ifcfg-bond0.1 [0affe814-3a2d-f59f-16b9-000000000446] 8454 1726882429.92762: sending task result for task 0affe814-3a2d-f59f-16b9-000000000446 skipping: [managed_node3] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 8454 1726882429.92936: no more pending results, returning what we have 8454 1726882429.92941: results queue empty 8454 1726882429.92942: checking for any_errors_fatal 8454 1726882429.92949: done checking for any_errors_fatal 8454 1726882429.92950: checking for max_fail_percentage 8454 1726882429.92953: done checking for max_fail_percentage 8454 1726882429.92954: checking to see if all hosts have failed and the 
running result is not ok 8454 1726882429.92955: done checking to see if all hosts have failed 8454 1726882429.92956: getting the remaining hosts for this loop 8454 1726882429.92958: done getting the remaining hosts for this loop 8454 1726882429.92963: getting the next task for host managed_node3 8454 1726882429.92972: done getting next task for host managed_node3 8454 1726882429.92975: ^ task is: TASK: Verify the ansible_managed comment in ifcfg-{{ profile }} 8454 1726882429.92981: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8454 1726882429.92987: getting variables 8454 1726882429.92988: in VariableManager get_vars() 8454 1726882429.93239: Calling all_inventory to load vars for managed_node3 8454 1726882429.93243: Calling groups_inventory to load vars for managed_node3 8454 1726882429.93247: Calling all_plugins_inventory to load vars for managed_node3 8454 1726882429.93262: Calling all_plugins_play to load vars for managed_node3 8454 1726882429.93265: Calling groups_plugins_inventory to load vars for managed_node3 8454 1726882429.93270: Calling groups_plugins_play to load vars for managed_node3 8454 1726882429.94042: done sending task result for task 0affe814-3a2d-f59f-16b9-000000000446 8454 1726882429.94047: WORKER PROCESS EXITING 8454 1726882429.97799: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8454 1726882430.01498: done with get_vars() 8454 1726882430.01553: done getting variables 8454 1726882430.01637: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 8454 1726882430.01794: variable 'profile' from source: include params 8454 1726882430.01799: variable 'item' from source: include params 8454 1726882430.01894: variable 'item' from source: include params TASK [Verify the ansible_managed comment in ifcfg-bond0.1] ********************* task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:56 Friday 20 September 2024 21:33:50 -0400 (0:00:00.124) 0:00:28.037 ****** 8454 1726882430.01945: entering _queue_task() for managed_node3/set_fact 8454 1726882430.02406: worker is 1 (out of 1 available) 8454 1726882430.02425: exiting _queue_task() for managed_node3/set_fact 8454 1726882430.02440: done 
queuing things up, now waiting for results queue to drain 8454 1726882430.02442: waiting for pending results... 8454 1726882430.02714: running TaskExecutor() for managed_node3/TASK: Verify the ansible_managed comment in ifcfg-bond0.1 8454 1726882430.02945: in run() - task 0affe814-3a2d-f59f-16b9-000000000447 8454 1726882430.02950: variable 'ansible_search_path' from source: unknown 8454 1726882430.02953: variable 'ansible_search_path' from source: unknown 8454 1726882430.02964: calling self._execute() 8454 1726882430.03081: variable 'ansible_host' from source: host vars for 'managed_node3' 8454 1726882430.03094: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8454 1726882430.03110: variable 'omit' from source: magic vars 8454 1726882430.03554: variable 'ansible_distribution_major_version' from source: facts 8454 1726882430.03574: Evaluated conditional (ansible_distribution_major_version != '6'): True 8454 1726882430.03816: variable 'profile_stat' from source: set_fact 8454 1726882430.03820: Evaluated conditional (profile_stat.stat.exists): False 8454 1726882430.03823: when evaluation is False, skipping this task 8454 1726882430.03825: _execute() done 8454 1726882430.03828: dumping result to json 8454 1726882430.03830: done dumping result, returning 8454 1726882430.03833: done running TaskExecutor() for managed_node3/TASK: Verify the ansible_managed comment in ifcfg-bond0.1 [0affe814-3a2d-f59f-16b9-000000000447] 8454 1726882430.03837: sending task result for task 0affe814-3a2d-f59f-16b9-000000000447 8454 1726882430.04150: done sending task result for task 0affe814-3a2d-f59f-16b9-000000000447 8454 1726882430.04154: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 8454 1726882430.04201: no more pending results, returning what we have 8454 1726882430.04204: results queue empty 8454 1726882430.04206: checking for 
any_errors_fatal 8454 1726882430.04211: done checking for any_errors_fatal 8454 1726882430.04212: checking for max_fail_percentage 8454 1726882430.04214: done checking for max_fail_percentage 8454 1726882430.04215: checking to see if all hosts have failed and the running result is not ok 8454 1726882430.04216: done checking to see if all hosts have failed 8454 1726882430.04217: getting the remaining hosts for this loop 8454 1726882430.04219: done getting the remaining hosts for this loop 8454 1726882430.04222: getting the next task for host managed_node3 8454 1726882430.04229: done getting next task for host managed_node3 8454 1726882430.04232: ^ task is: TASK: Get the fingerprint comment in ifcfg-{{ profile }} 8454 1726882430.04238: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8454 1726882430.04243: getting variables 8454 1726882430.04245: in VariableManager get_vars() 8454 1726882430.04294: Calling all_inventory to load vars for managed_node3 8454 1726882430.04298: Calling groups_inventory to load vars for managed_node3 8454 1726882430.04301: Calling all_plugins_inventory to load vars for managed_node3 8454 1726882430.04313: Calling all_plugins_play to load vars for managed_node3 8454 1726882430.04316: Calling groups_plugins_inventory to load vars for managed_node3 8454 1726882430.04320: Calling groups_plugins_play to load vars for managed_node3 8454 1726882430.06887: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8454 1726882430.10814: done with get_vars() 8454 1726882430.10980: done getting variables 8454 1726882430.11105: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 8454 1726882430.11302: variable 'profile' from source: include params 8454 1726882430.11307: variable 'item' from source: include params 8454 1726882430.11639: variable 'item' from source: include params TASK [Get the fingerprint comment in ifcfg-bond0.1] **************************** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:62 Friday 20 September 2024 21:33:50 -0400 (0:00:00.097) 0:00:28.134 ****** 8454 1726882430.11676: entering _queue_task() for managed_node3/command 8454 1726882430.12169: worker is 1 (out of 1 available) 8454 1726882430.12183: exiting _queue_task() for managed_node3/command 8454 1726882430.12195: done queuing things up, now waiting for results queue to drain 8454 1726882430.12197: waiting for pending results... 
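[Editor's note — illustrative sketch, not part of the log.] The fingerprint tasks traced here follow the same gate pattern: a `stat` result registered earlier in `get_profile_stat.yml` guards each follow-up task, so every one of them is skipped with "Conditional result was False" when the profile file is absent. A minimal sketch of that pattern, with all names and paths assumed:

```yaml
# Sketch of the gating pattern this log traces (assumed names/paths).
# The registered stat result drives the skips reported for tasks at
# get_profile_stat.yml:62 and :69.
- name: Stat the profile file
  stat:
    path: /etc/sysconfig/network-scripts/ifcfg-{{ profile }}
  register: profile_stat

- name: Get the fingerprint comment in ifcfg-{{ profile }}
  command: grep "fingerprint" /etc/sysconfig/network-scripts/ifcfg-{{ profile }}  # pattern assumed
  when: profile_stat.stat.exists
```

Note that the skip happens per task: each guarded task still goes through `_queue_task()`, variable resolution, and conditional evaluation in the worker, which is why the log repeats the full queue/execute/skip cycle for every one of them.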
8454 1726882430.12426: running TaskExecutor() for managed_node3/TASK: Get the fingerprint comment in ifcfg-bond0.1 8454 1726882430.12606: in run() - task 0affe814-3a2d-f59f-16b9-000000000448 8454 1726882430.12641: variable 'ansible_search_path' from source: unknown 8454 1726882430.12689: variable 'ansible_search_path' from source: unknown 8454 1726882430.12726: calling self._execute() 8454 1726882430.12856: variable 'ansible_host' from source: host vars for 'managed_node3' 8454 1726882430.12917: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8454 1726882430.12921: variable 'omit' from source: magic vars 8454 1726882430.13397: variable 'ansible_distribution_major_version' from source: facts 8454 1726882430.13415: Evaluated conditional (ansible_distribution_major_version != '6'): True 8454 1726882430.13591: variable 'profile_stat' from source: set_fact 8454 1726882430.13612: Evaluated conditional (profile_stat.stat.exists): False 8454 1726882430.13620: when evaluation is False, skipping this task 8454 1726882430.13701: _execute() done 8454 1726882430.13705: dumping result to json 8454 1726882430.13707: done dumping result, returning 8454 1726882430.13710: done running TaskExecutor() for managed_node3/TASK: Get the fingerprint comment in ifcfg-bond0.1 [0affe814-3a2d-f59f-16b9-000000000448] 8454 1726882430.13712: sending task result for task 0affe814-3a2d-f59f-16b9-000000000448 8454 1726882430.13788: done sending task result for task 0affe814-3a2d-f59f-16b9-000000000448 skipping: [managed_node3] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 8454 1726882430.13855: no more pending results, returning what we have 8454 1726882430.13860: results queue empty 8454 1726882430.13861: checking for any_errors_fatal 8454 1726882430.13871: done checking for any_errors_fatal 8454 1726882430.13872: checking for max_fail_percentage 8454 1726882430.13874: done checking for 
max_fail_percentage 8454 1726882430.13876: checking to see if all hosts have failed and the running result is not ok 8454 1726882430.13877: done checking to see if all hosts have failed 8454 1726882430.13878: getting the remaining hosts for this loop 8454 1726882430.13880: done getting the remaining hosts for this loop 8454 1726882430.13885: getting the next task for host managed_node3 8454 1726882430.13893: done getting next task for host managed_node3 8454 1726882430.13896: ^ task is: TASK: Verify the fingerprint comment in ifcfg-{{ profile }} 8454 1726882430.13903: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8454 1726882430.13908: getting variables 8454 1726882430.13910: in VariableManager get_vars() 8454 1726882430.14075: Calling all_inventory to load vars for managed_node3 8454 1726882430.14079: Calling groups_inventory to load vars for managed_node3 8454 1726882430.14082: Calling all_plugins_inventory to load vars for managed_node3 8454 1726882430.14100: Calling all_plugins_play to load vars for managed_node3 8454 1726882430.14104: Calling groups_plugins_inventory to load vars for managed_node3 8454 1726882430.14108: Calling groups_plugins_play to load vars for managed_node3 8454 1726882430.15011: WORKER PROCESS EXITING 8454 1726882430.17358: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8454 1726882430.20222: done with get_vars() 8454 1726882430.20269: done getting variables 8454 1726882430.20346: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 8454 1726882430.20479: variable 'profile' from source: include params 8454 1726882430.20484: variable 'item' from source: include params 8454 1726882430.20563: variable 'item' from source: include params TASK [Verify the fingerprint comment in ifcfg-bond0.1] ************************* task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:69 Friday 20 September 2024 21:33:50 -0400 (0:00:00.089) 0:00:28.223 ****** 8454 1726882430.20599: entering _queue_task() for managed_node3/set_fact 8454 1726882430.20962: worker is 1 (out of 1 available) 8454 1726882430.20976: exiting _queue_task() for managed_node3/set_fact 8454 1726882430.20994: done queuing things up, now waiting for results queue to drain 8454 1726882430.20996: waiting for 
pending results... 8454 1726882430.21304: running TaskExecutor() for managed_node3/TASK: Verify the fingerprint comment in ifcfg-bond0.1 8454 1726882430.21462: in run() - task 0affe814-3a2d-f59f-16b9-000000000449 8454 1726882430.21486: variable 'ansible_search_path' from source: unknown 8454 1726882430.21494: variable 'ansible_search_path' from source: unknown 8454 1726882430.21546: calling self._execute() 8454 1726882430.21662: variable 'ansible_host' from source: host vars for 'managed_node3' 8454 1726882430.21675: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8454 1726882430.21695: variable 'omit' from source: magic vars 8454 1726882430.22182: variable 'ansible_distribution_major_version' from source: facts 8454 1726882430.22221: Evaluated conditional (ansible_distribution_major_version != '6'): True 8454 1726882430.22426: variable 'profile_stat' from source: set_fact 8454 1726882430.22449: Evaluated conditional (profile_stat.stat.exists): False 8454 1726882430.22457: when evaluation is False, skipping this task 8454 1726882430.22464: _execute() done 8454 1726882430.22472: dumping result to json 8454 1726882430.22484: done dumping result, returning 8454 1726882430.22501: done running TaskExecutor() for managed_node3/TASK: Verify the fingerprint comment in ifcfg-bond0.1 [0affe814-3a2d-f59f-16b9-000000000449] 8454 1726882430.22512: sending task result for task 0affe814-3a2d-f59f-16b9-000000000449 skipping: [managed_node3] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 8454 1726882430.22678: no more pending results, returning what we have 8454 1726882430.22684: results queue empty 8454 1726882430.22685: checking for any_errors_fatal 8454 1726882430.22695: done checking for any_errors_fatal 8454 1726882430.22696: checking for max_fail_percentage 8454 1726882430.22698: done checking for max_fail_percentage 8454 1726882430.22700: checking to see if all hosts have 
failed and the running result is not ok 8454 1726882430.22701: done checking to see if all hosts have failed 8454 1726882430.22702: getting the remaining hosts for this loop 8454 1726882430.22704: done getting the remaining hosts for this loop 8454 1726882430.22709: getting the next task for host managed_node3 8454 1726882430.22719: done getting next task for host managed_node3 8454 1726882430.22722: ^ task is: TASK: Assert that the profile is present - '{{ profile }}' 8454 1726882430.22727: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8454 1726882430.22731: getting variables 8454 1726882430.22733: in VariableManager get_vars() 8454 1726882430.22888: Calling all_inventory to load vars for managed_node3 8454 1726882430.22891: Calling groups_inventory to load vars for managed_node3 8454 1726882430.22895: Calling all_plugins_inventory to load vars for managed_node3 8454 1726882430.22911: Calling all_plugins_play to load vars for managed_node3 8454 1726882430.22914: Calling groups_plugins_inventory to load vars for managed_node3 8454 1726882430.22918: Calling groups_plugins_play to load vars for managed_node3 8454 1726882430.23582: done sending task result for task 0affe814-3a2d-f59f-16b9-000000000449 8454 1726882430.23587: WORKER PROCESS EXITING 8454 1726882430.25413: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8454 1726882430.28279: done with get_vars() 8454 1726882430.28313: done getting variables 8454 1726882430.28387: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 8454 1726882430.28517: variable 'profile' from source: include params 8454 1726882430.28521: variable 'item' from source: include params 8454 1726882430.28684: variable 'item' from source: include params TASK [Assert that the profile is present - 'bond0.1'] ************************** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:5 Friday 20 September 2024 21:33:50 -0400 (0:00:00.081) 0:00:28.304 ****** 8454 1726882430.28727: entering _queue_task() for managed_node3/assert 8454 1726882430.29138: worker is 1 (out of 1 available) 8454 1726882430.29152: exiting _queue_task() for managed_node3/assert 8454 1726882430.29164: done 
queuing things up, now waiting for results queue to drain 8454 1726882430.29166: waiting for pending results... 8454 1726882430.29364: running TaskExecutor() for managed_node3/TASK: Assert that the profile is present - 'bond0.1' 8454 1726882430.29487: in run() - task 0affe814-3a2d-f59f-16b9-00000000026e 8454 1726882430.29513: variable 'ansible_search_path' from source: unknown 8454 1726882430.29560: variable 'ansible_search_path' from source: unknown 8454 1726882430.29574: calling self._execute() 8454 1726882430.29702: variable 'ansible_host' from source: host vars for 'managed_node3' 8454 1726882430.29710: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8454 1726882430.29723: variable 'omit' from source: magic vars 8454 1726882430.30217: variable 'ansible_distribution_major_version' from source: facts 8454 1726882430.30221: Evaluated conditional (ansible_distribution_major_version != '6'): True 8454 1726882430.30224: variable 'omit' from source: magic vars 8454 1726882430.30228: variable 'omit' from source: magic vars 8454 1726882430.30310: variable 'profile' from source: include params 8454 1726882430.30314: variable 'item' from source: include params 8454 1726882430.30394: variable 'item' from source: include params 8454 1726882430.30416: variable 'omit' from source: magic vars 8454 1726882430.30465: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 8454 1726882430.30508: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 8454 1726882430.30529: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 8454 1726882430.30552: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8454 1726882430.30565: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, 
class_only=False) 8454 1726882430.30601: variable 'inventory_hostname' from source: host vars for 'managed_node3' 8454 1726882430.30604: variable 'ansible_host' from source: host vars for 'managed_node3' 8454 1726882430.30610: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8454 1726882430.30732: Set connection var ansible_connection to ssh 8454 1726882430.30761: Set connection var ansible_shell_executable to /bin/sh 8454 1726882430.30764: Set connection var ansible_timeout to 10 8454 1726882430.30767: Set connection var ansible_shell_type to sh 8454 1726882430.30769: Set connection var ansible_pipelining to False 8454 1726882430.30772: Set connection var ansible_module_compression to ZIP_DEFLATED 8454 1726882430.30872: variable 'ansible_shell_executable' from source: unknown 8454 1726882430.30875: variable 'ansible_connection' from source: unknown 8454 1726882430.30878: variable 'ansible_module_compression' from source: unknown 8454 1726882430.30880: variable 'ansible_shell_type' from source: unknown 8454 1726882430.30882: variable 'ansible_shell_executable' from source: unknown 8454 1726882430.30884: variable 'ansible_host' from source: host vars for 'managed_node3' 8454 1726882430.30887: variable 'ansible_pipelining' from source: unknown 8454 1726882430.30890: variable 'ansible_timeout' from source: unknown 8454 1726882430.30892: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8454 1726882430.30990: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 8454 1726882430.31003: variable 'omit' from source: magic vars 8454 1726882430.31009: starting attempt loop 8454 1726882430.31012: running the handler 8454 1726882430.31147: variable 'lsr_net_profile_exists' 
from source: set_fact 8454 1726882430.31153: Evaluated conditional (lsr_net_profile_exists): True 8454 1726882430.31160: handler run complete 8454 1726882430.31180: attempt loop complete, returning result 8454 1726882430.31197: _execute() done 8454 1726882430.31201: dumping result to json 8454 1726882430.31204: done dumping result, returning 8454 1726882430.31206: done running TaskExecutor() for managed_node3/TASK: Assert that the profile is present - 'bond0.1' [0affe814-3a2d-f59f-16b9-00000000026e] 8454 1726882430.31209: sending task result for task 0affe814-3a2d-f59f-16b9-00000000026e 8454 1726882430.31372: done sending task result for task 0affe814-3a2d-f59f-16b9-00000000026e 8454 1726882430.31376: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false } MSG: All assertions passed 8454 1726882430.31451: no more pending results, returning what we have 8454 1726882430.31455: results queue empty 8454 1726882430.31456: checking for any_errors_fatal 8454 1726882430.31462: done checking for any_errors_fatal 8454 1726882430.31463: checking for max_fail_percentage 8454 1726882430.31465: done checking for max_fail_percentage 8454 1726882430.31466: checking to see if all hosts have failed and the running result is not ok 8454 1726882430.31467: done checking to see if all hosts have failed 8454 1726882430.31467: getting the remaining hosts for this loop 8454 1726882430.31469: done getting the remaining hosts for this loop 8454 1726882430.31472: getting the next task for host managed_node3 8454 1726882430.31477: done getting next task for host managed_node3 8454 1726882430.31480: ^ task is: TASK: Assert that the ansible managed comment is present in '{{ profile }}' 8454 1726882430.31483: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8454 1726882430.31486: getting variables 8454 1726882430.31488: in VariableManager get_vars() 8454 1726882430.31525: Calling all_inventory to load vars for managed_node3 8454 1726882430.31528: Calling groups_inventory to load vars for managed_node3 8454 1726882430.31531: Calling all_plugins_inventory to load vars for managed_node3 8454 1726882430.31547: Calling all_plugins_play to load vars for managed_node3 8454 1726882430.31551: Calling groups_plugins_inventory to load vars for managed_node3 8454 1726882430.31555: Calling groups_plugins_play to load vars for managed_node3 8454 1726882430.33679: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8454 1726882430.37082: done with get_vars() 8454 1726882430.37120: done getting variables 8454 1726882430.37315: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 8454 1726882430.37837: variable 'profile' from source: include params 8454 1726882430.37842: variable 'item' from source: include params 8454 1726882430.37922: variable 'item' from source: include params TASK [Assert that the ansible managed comment is present in 'bond0.1'] ********* task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:10 Friday 20 September 2024 21:33:50 -0400 (0:00:00.092) 
0:00:28.397 ****** 8454 1726882430.38037: entering _queue_task() for managed_node3/assert 8454 1726882430.38685: worker is 1 (out of 1 available) 8454 1726882430.38698: exiting _queue_task() for managed_node3/assert 8454 1726882430.38712: done queuing things up, now waiting for results queue to drain 8454 1726882430.38714: waiting for pending results... 8454 1726882430.39157: running TaskExecutor() for managed_node3/TASK: Assert that the ansible managed comment is present in 'bond0.1' 8454 1726882430.39313: in run() - task 0affe814-3a2d-f59f-16b9-00000000026f 8454 1726882430.39317: variable 'ansible_search_path' from source: unknown 8454 1726882430.39320: variable 'ansible_search_path' from source: unknown 8454 1726882430.39422: calling self._execute() 8454 1726882430.39477: variable 'ansible_host' from source: host vars for 'managed_node3' 8454 1726882430.39491: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8454 1726882430.39508: variable 'omit' from source: magic vars 8454 1726882430.39952: variable 'ansible_distribution_major_version' from source: facts 8454 1726882430.39982: Evaluated conditional (ansible_distribution_major_version != '6'): True 8454 1726882430.39995: variable 'omit' from source: magic vars 8454 1726882430.40049: variable 'omit' from source: magic vars 8454 1726882430.40188: variable 'profile' from source: include params 8454 1726882430.40200: variable 'item' from source: include params 8454 1726882430.40298: variable 'item' from source: include params 8454 1726882430.40317: variable 'omit' from source: magic vars 8454 1726882430.40396: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 8454 1726882430.40424: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 8454 1726882430.40453: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 8454 1726882430.40479: Loading ShellModule 'sh' 
from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8454 1726882430.40515: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8454 1726882430.40552: variable 'inventory_hostname' from source: host vars for 'managed_node3' 8454 1726882430.40613: variable 'ansible_host' from source: host vars for 'managed_node3' 8454 1726882430.40621: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8454 1726882430.40704: Set connection var ansible_connection to ssh 8454 1726882430.40725: Set connection var ansible_shell_executable to /bin/sh 8454 1726882430.40745: Set connection var ansible_timeout to 10 8454 1726882430.40753: Set connection var ansible_shell_type to sh 8454 1726882430.40767: Set connection var ansible_pipelining to False 8454 1726882430.40778: Set connection var ansible_module_compression to ZIP_DEFLATED 8454 1726882430.40806: variable 'ansible_shell_executable' from source: unknown 8454 1726882430.40814: variable 'ansible_connection' from source: unknown 8454 1726882430.40822: variable 'ansible_module_compression' from source: unknown 8454 1726882430.40840: variable 'ansible_shell_type' from source: unknown 8454 1726882430.40939: variable 'ansible_shell_executable' from source: unknown 8454 1726882430.40942: variable 'ansible_host' from source: host vars for 'managed_node3' 8454 1726882430.40950: variable 'ansible_pipelining' from source: unknown 8454 1726882430.40953: variable 'ansible_timeout' from source: unknown 8454 1726882430.40955: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8454 1726882430.41207: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 8454 1726882430.41288: variable 'omit' from source: magic vars 8454 1726882430.41310: starting attempt loop 8454 1726882430.41332: running the handler 8454 1726882430.42252: variable 'lsr_net_profile_ansible_managed' from source: set_fact 8454 1726882430.42256: Evaluated conditional (lsr_net_profile_ansible_managed): True 8454 1726882430.42259: handler run complete 8454 1726882430.42261: attempt loop complete, returning result 8454 1726882430.42264: _execute() done 8454 1726882430.42267: dumping result to json 8454 1726882430.42269: done dumping result, returning 8454 1726882430.42272: done running TaskExecutor() for managed_node3/TASK: Assert that the ansible managed comment is present in 'bond0.1' [0affe814-3a2d-f59f-16b9-00000000026f] 8454 1726882430.42274: sending task result for task 0affe814-3a2d-f59f-16b9-00000000026f 8454 1726882430.42342: done sending task result for task 0affe814-3a2d-f59f-16b9-00000000026f 8454 1726882430.42345: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false } MSG: All assertions passed 8454 1726882430.42406: no more pending results, returning what we have 8454 1726882430.42409: results queue empty 8454 1726882430.42411: checking for any_errors_fatal 8454 1726882430.42419: done checking for any_errors_fatal 8454 1726882430.42420: checking for max_fail_percentage 8454 1726882430.42422: done checking for max_fail_percentage 8454 1726882430.42423: checking to see if all hosts have failed and the running result is not ok 8454 1726882430.42424: done checking to see if all hosts have failed 8454 1726882430.42425: getting the remaining hosts for this loop 8454 1726882430.42428: done getting the remaining hosts for this loop 8454 1726882430.42433: getting the next task for host managed_node3 8454 1726882430.42442: done getting next task for host 
managed_node3 8454 1726882430.42445: ^ task is: TASK: Assert that the fingerprint comment is present in {{ profile }} 8454 1726882430.42450: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8454 1726882430.42455: getting variables 8454 1726882430.42457: in VariableManager get_vars() 8454 1726882430.42505: Calling all_inventory to load vars for managed_node3 8454 1726882430.42509: Calling groups_inventory to load vars for managed_node3 8454 1726882430.42512: Calling all_plugins_inventory to load vars for managed_node3 8454 1726882430.42525: Calling all_plugins_play to load vars for managed_node3 8454 1726882430.42528: Calling groups_plugins_inventory to load vars for managed_node3 8454 1726882430.42532: Calling groups_plugins_play to load vars for managed_node3 8454 1726882430.47862: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8454 1726882430.53754: done with get_vars() 8454 1726882430.53795: done getting variables 8454 1726882430.53865: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 8454 1726882430.54195: variable 'profile' from source: include params 8454 1726882430.54200: variable 'item' from source: include params 8454 
1726882430.54274: variable 'item' from source: include params TASK [Assert that the fingerprint comment is present in bond0.1] *************** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:15 Friday 20 September 2024 21:33:50 -0400 (0:00:00.163) 0:00:28.560 ****** 8454 1726882430.54313: entering _queue_task() for managed_node3/assert 8454 1726882430.55074: worker is 1 (out of 1 available) 8454 1726882430.55085: exiting _queue_task() for managed_node3/assert 8454 1726882430.55098: done queuing things up, now waiting for results queue to drain 8454 1726882430.55099: waiting for pending results... 8454 1726882430.55267: running TaskExecutor() for managed_node3/TASK: Assert that the fingerprint comment is present in bond0.1 8454 1726882430.55409: in run() - task 0affe814-3a2d-f59f-16b9-000000000270 8454 1726882430.55439: variable 'ansible_search_path' from source: unknown 8454 1726882430.55484: variable 'ansible_search_path' from source: unknown 8454 1726882430.55522: calling self._execute() 8454 1726882430.55660: variable 'ansible_host' from source: host vars for 'managed_node3' 8454 1726882430.55740: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8454 1726882430.55744: variable 'omit' from source: magic vars 8454 1726882430.56161: variable 'ansible_distribution_major_version' from source: facts 8454 1726882430.56181: Evaluated conditional (ansible_distribution_major_version != '6'): True 8454 1726882430.56202: variable 'omit' from source: magic vars 8454 1726882430.56256: variable 'omit' from source: magic vars 8454 1726882430.56389: variable 'profile' from source: include params 8454 1726882430.56418: variable 'item' from source: include params 8454 1726882430.56492: variable 'item' from source: include params 8454 1726882430.56624: variable 'omit' from source: magic vars 8454 1726882430.56629: trying 
/usr/local/lib/python3.12/site-packages/ansible/plugins/connection 8454 1726882430.56632: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 8454 1726882430.56659: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 8454 1726882430.56686: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8454 1726882430.56706: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8454 1726882430.56769: variable 'inventory_hostname' from source: host vars for 'managed_node3' 8454 1726882430.56780: variable 'ansible_host' from source: host vars for 'managed_node3' 8454 1726882430.56789: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8454 1726882430.56923: Set connection var ansible_connection to ssh 8454 1726882430.56941: Set connection var ansible_shell_executable to /bin/sh 8454 1726882430.56959: Set connection var ansible_timeout to 10 8454 1726882430.56969: Set connection var ansible_shell_type to sh 8454 1726882430.56983: Set connection var ansible_pipelining to False 8454 1726882430.56995: Set connection var ansible_module_compression to ZIP_DEFLATED 8454 1726882430.57023: variable 'ansible_shell_executable' from source: unknown 8454 1726882430.57031: variable 'ansible_connection' from source: unknown 8454 1726882430.57040: variable 'ansible_module_compression' from source: unknown 8454 1726882430.57048: variable 'ansible_shell_type' from source: unknown 8454 1726882430.57059: variable 'ansible_shell_executable' from source: unknown 8454 1726882430.57071: variable 'ansible_host' from source: host vars for 'managed_node3' 8454 1726882430.57082: variable 'ansible_pipelining' from source: unknown 8454 1726882430.57091: variable 'ansible_timeout' from source: unknown 8454 
1726882430.57157: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8454 1726882430.57292: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 8454 1726882430.57311: variable 'omit' from source: magic vars 8454 1726882430.57322: starting attempt loop 8454 1726882430.57329: running the handler 8454 1726882430.57471: variable 'lsr_net_profile_fingerprint' from source: set_fact 8454 1726882430.57482: Evaluated conditional (lsr_net_profile_fingerprint): True 8454 1726882430.57497: handler run complete 8454 1726882430.57525: attempt loop complete, returning result 8454 1726882430.57532: _execute() done 8454 1726882430.57601: dumping result to json 8454 1726882430.57605: done dumping result, returning 8454 1726882430.57607: done running TaskExecutor() for managed_node3/TASK: Assert that the fingerprint comment is present in bond0.1 [0affe814-3a2d-f59f-16b9-000000000270] 8454 1726882430.57609: sending task result for task 0affe814-3a2d-f59f-16b9-000000000270 8454 1726882430.57685: done sending task result for task 0affe814-3a2d-f59f-16b9-000000000270 8454 1726882430.57689: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false } MSG: All assertions passed 8454 1726882430.57763: no more pending results, returning what we have 8454 1726882430.57767: results queue empty 8454 1726882430.57769: checking for any_errors_fatal 8454 1726882430.57779: done checking for any_errors_fatal 8454 1726882430.57780: checking for max_fail_percentage 8454 1726882430.57783: done checking for max_fail_percentage 8454 1726882430.57784: checking to see if all hosts have failed and the running result is not ok 8454 1726882430.57785: done checking to see if all hosts have failed 8454 1726882430.57786: getting the 
remaining hosts for this loop 8454 1726882430.57788: done getting the remaining hosts for this loop 8454 1726882430.57793: getting the next task for host managed_node3 8454 1726882430.57802: done getting next task for host managed_node3 8454 1726882430.57806: ^ task is: TASK: ** TEST check polling interval 8454 1726882430.57809: ^ state is: HOST STATE: block=2, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8454 1726882430.57814: getting variables 8454 1726882430.57816: in VariableManager get_vars() 8454 1726882430.57978: Calling all_inventory to load vars for managed_node3 8454 1726882430.57982: Calling groups_inventory to load vars for managed_node3 8454 1726882430.57985: Calling all_plugins_inventory to load vars for managed_node3 8454 1726882430.58000: Calling all_plugins_play to load vars for managed_node3 8454 1726882430.58004: Calling groups_plugins_inventory to load vars for managed_node3 8454 1726882430.58008: Calling groups_plugins_play to load vars for managed_node3 8454 1726882430.61473: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8454 1726882430.67671: done with get_vars() 8454 1726882430.67715: done getting variables 8454 1726882430.67786: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [** TEST check polling interval] ****************************************** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond.yml:75 Friday 20 September 2024 21:33:50 -0400 
(0:00:00.135) 0:00:28.695 ****** 8454 1726882430.67820: entering _queue_task() for managed_node3/command 8454 1726882430.68580: worker is 1 (out of 1 available) 8454 1726882430.68596: exiting _queue_task() for managed_node3/command 8454 1726882430.68612: done queuing things up, now waiting for results queue to drain 8454 1726882430.68613: waiting for pending results... 8454 1726882430.69077: running TaskExecutor() for managed_node3/TASK: ** TEST check polling interval 8454 1726882430.69172: in run() - task 0affe814-3a2d-f59f-16b9-000000000071 8454 1726882430.69177: variable 'ansible_search_path' from source: unknown 8454 1726882430.69486: calling self._execute() 8454 1726882430.69595: variable 'ansible_host' from source: host vars for 'managed_node3' 8454 1726882430.69605: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8454 1726882430.69639: variable 'omit' from source: magic vars 8454 1726882430.70299: variable 'ansible_distribution_major_version' from source: facts 8454 1726882430.70303: Evaluated conditional (ansible_distribution_major_version != '6'): True 8454 1726882430.70305: variable 'omit' from source: magic vars 8454 1726882430.70307: variable 'omit' from source: magic vars 8454 1726882430.70624: variable 'controller_device' from source: play vars 8454 1726882430.70852: variable 'omit' from source: magic vars 8454 1726882430.70916: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 8454 1726882430.71097: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 8454 1726882430.71100: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 8454 1726882430.71152: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8454 1726882430.71165: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8454 1726882430.71321: variable 'inventory_hostname' from source: host vars for 'managed_node3' 8454 1726882430.71325: variable 'ansible_host' from source: host vars for 'managed_node3' 8454 1726882430.71330: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8454 1726882430.71583: Set connection var ansible_connection to ssh 8454 1726882430.71595: Set connection var ansible_shell_executable to /bin/sh 8454 1726882430.71603: Set connection var ansible_timeout to 10 8454 1726882430.71606: Set connection var ansible_shell_type to sh 8454 1726882430.71618: Set connection var ansible_pipelining to False 8454 1726882430.71746: Set connection var ansible_module_compression to ZIP_DEFLATED 8454 1726882430.71771: variable 'ansible_shell_executable' from source: unknown 8454 1726882430.71775: variable 'ansible_connection' from source: unknown 8454 1726882430.71778: variable 'ansible_module_compression' from source: unknown 8454 1726882430.71784: variable 'ansible_shell_type' from source: unknown 8454 1726882430.71789: variable 'ansible_shell_executable' from source: unknown 8454 1726882430.71794: variable 'ansible_host' from source: host vars for 'managed_node3' 8454 1726882430.71799: variable 'ansible_pipelining' from source: unknown 8454 1726882430.71801: variable 'ansible_timeout' from source: unknown 8454 1726882430.71808: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8454 1726882430.72188: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 8454 1726882430.72204: variable 'omit' from source: magic vars 8454 1726882430.72222: starting attempt loop 8454 
1726882430.72226: running the handler 8454 1726882430.72456: _low_level_execute_command(): starting 8454 1726882430.72459: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 8454 1726882430.74026: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 8454 1726882430.74030: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8454 1726882430.74215: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8454 1726882430.76105: stdout chunk (state=3): >>>/root <<< 8454 1726882430.76176: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8454 1726882430.76515: stderr chunk (state=3): >>><<< 8454 1726882430.76518: stdout chunk (state=3): >>><<< 8454 1726882430.76522: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 8454 1726882430.76524: _low_level_execute_command(): starting 8454 1726882430.76527: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882430.764459-9485-107106978488210 `" && echo ansible-tmp-1726882430.764459-9485-107106978488210="` echo /root/.ansible/tmp/ansible-tmp-1726882430.764459-9485-107106978488210 `" ) && sleep 0' 8454 1726882430.77699: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 8454 1726882430.77712: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 8454 1726882430.77722: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 8454 1726882430.77739: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 8454 1726882430.77754: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 <<< 8454 1726882430.77764: stderr chunk 
(state=3): >>>debug2: match not found <<< 8454 1726882430.77778: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8454 1726882430.77796: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 8454 1726882430.77893: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 8454 1726882430.77905: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8454 1726882430.78056: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8454 1726882430.80169: stdout chunk (state=3): >>>ansible-tmp-1726882430.764459-9485-107106978488210=/root/.ansible/tmp/ansible-tmp-1726882430.764459-9485-107106978488210 <<< 8454 1726882430.80355: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8454 1726882430.80373: stderr chunk (state=3): >>><<< 8454 1726882430.80382: stdout chunk (state=3): >>><<< 8454 1726882430.80405: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882430.764459-9485-107106978488210=/root/.ansible/tmp/ansible-tmp-1726882430.764459-9485-107106978488210 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 8454 1726882430.80447: variable 'ansible_module_compression' from source: unknown 8454 1726882430.80515: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-8454ynh6eu7y/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 8454 1726882430.80739: variable 'ansible_facts' from source: unknown 8454 1726882430.80743: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882430.764459-9485-107106978488210/AnsiballZ_command.py 8454 1726882430.80884: Sending initial data 8454 1726882430.80888: Sent initial data (153 bytes) 8454 1726882430.81411: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 8454 1726882430.81421: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 8454 1726882430.81532: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: 
hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 8454 1726882430.81577: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8454 1726882430.81684: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8454 1726882430.83442: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 8454 1726882430.83559: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 8454 1726882430.83664: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-8454ynh6eu7y/tmp5xfcmisf /root/.ansible/tmp/ansible-tmp-1726882430.764459-9485-107106978488210/AnsiballZ_command.py <<< 8454 1726882430.83674: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882430.764459-9485-107106978488210/AnsiballZ_command.py" <<< 8454 1726882430.83803: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-8454ynh6eu7y/tmp5xfcmisf" to remote "/root/.ansible/tmp/ansible-tmp-1726882430.764459-9485-107106978488210/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882430.764459-9485-107106978488210/AnsiballZ_command.py" <<< 8454 1726882430.85377: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8454 1726882430.85394: stdout chunk (state=3): >>><<< 8454 1726882430.85411: stderr chunk (state=3): >>><<< 8454 1726882430.85538: done transferring module to remote 8454 1726882430.85541: _low_level_execute_command(): starting 8454 1726882430.85544: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882430.764459-9485-107106978488210/ /root/.ansible/tmp/ansible-tmp-1726882430.764459-9485-107106978488210/AnsiballZ_command.py && sleep 0' 8454 1726882430.86174: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 8454 1726882430.86178: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 8454 1726882430.86180: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 8454 1726882430.86183: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 8454 1726882430.86185: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 <<< 8454 1726882430.86187: stderr 
chunk (state=3): >>>debug2: match not found <<< 8454 1726882430.86189: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8454 1726882430.86191: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 8454 1726882430.86193: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.41.238 is address <<< 8454 1726882430.86227: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 8454 1726882430.86474: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8454 1726882430.86586: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8454 1726882430.88572: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8454 1726882430.88576: stdout chunk (state=3): >>><<< 8454 1726882430.88740: stderr chunk (state=3): >>><<< 8454 1726882430.88744: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 8454 1726882430.88747: _low_level_execute_command(): starting 8454 1726882430.88750: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882430.764459-9485-107106978488210/AnsiballZ_command.py && sleep 0' 8454 1726882430.89248: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 8454 1726882430.89258: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 8454 1726882430.89270: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 8454 1726882430.89298: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 8454 1726882430.89307: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 <<< 8454 1726882430.89315: stderr chunk (state=3): >>>debug2: match not found <<< 8454 1726882430.89326: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8454 1726882430.89350: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 8454 1726882430.89358: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.41.238 is address <<< 8454 1726882430.89366: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 8454 1726882430.89375: stderr chunk (state=3): >>>debug1: Reading 
configuration data /root/.ssh/config <<< 8454 1726882430.89394: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 8454 1726882430.89409: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 8454 1726882430.89417: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 <<< 8454 1726882430.89426: stderr chunk (state=3): >>>debug2: match found <<< 8454 1726882430.89437: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8454 1726882430.89518: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 8454 1726882430.89536: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 8454 1726882430.89549: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8454 1726882430.89687: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8454 1726882431.07203: stdout chunk (state=3): >>> {"changed": true, "stdout": "MII Polling Interval (ms): 110", "stderr": "", "rc": 0, "cmd": ["grep", "Polling Interval", "/proc/net/bonding/nm-bond"], "start": "2024-09-20 21:33:51.066431", "end": "2024-09-20 21:33:51.069998", "delta": "0:00:00.003567", "msg": "", "invocation": {"module_args": {"_raw_params": "grep 'Polling Interval' /proc/net/bonding/nm-bond", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 8454 1726882431.08841: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.41.238 closed. 
<<< 8454 1726882431.09043: stderr chunk (state=3): >>><<< 8454 1726882431.09047: stdout chunk (state=3): >>><<< 8454 1726882431.09050: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "MII Polling Interval (ms): 110", "stderr": "", "rc": 0, "cmd": ["grep", "Polling Interval", "/proc/net/bonding/nm-bond"], "start": "2024-09-20 21:33:51.066431", "end": "2024-09-20 21:33:51.069998", "delta": "0:00:00.003567", "msg": "", "invocation": {"module_args": {"_raw_params": "grep 'Polling Interval' /proc/net/bonding/nm-bond", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.41.238 closed. 
8454 1726882431.09053: done with _execute_module (ansible.legacy.command, {'_raw_params': "grep 'Polling Interval' /proc/net/bonding/nm-bond", '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882430.764459-9485-107106978488210/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 8454 1726882431.09055: _low_level_execute_command(): starting 8454 1726882431.09058: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882430.764459-9485-107106978488210/ > /dev/null 2>&1 && sleep 0' 8454 1726882431.09628: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 8454 1726882431.09642: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 8454 1726882431.09650: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 8454 1726882431.09668: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 8454 1726882431.09689: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 <<< 8454 1726882431.09697: stderr chunk (state=3): >>>debug2: match not found <<< 8454 1726882431.09709: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8454 1726882431.09724: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 8454 1726882431.09733: stderr chunk 
(state=3): >>>debug2: resolve_canonicalize: hostname 10.31.41.238 is address <<< 8454 1726882431.09743: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 8454 1726882431.09752: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 8454 1726882431.09762: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 8454 1726882431.09775: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 8454 1726882431.09788: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 <<< 8454 1726882431.09907: stderr chunk (state=3): >>>debug2: match found <<< 8454 1726882431.09914: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8454 1726882431.09917: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 8454 1726882431.09919: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 8454 1726882431.10024: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8454 1726882431.10252: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8454 1726882431.12148: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8454 1726882431.12197: stderr chunk (state=3): >>><<< 8454 1726882431.12204: stdout chunk (state=3): >>><<< 8454 1726882431.12219: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 8454 1726882431.12227: handler run complete 8454 1726882431.12257: Evaluated conditional (False): False 8454 1726882431.12388: variable 'result' from source: unknown 8454 1726882431.12404: Evaluated conditional ('110' in result.stdout): True 8454 1726882431.12418: attempt loop complete, returning result 8454 1726882431.12421: _execute() done 8454 1726882431.12424: dumping result to json 8454 1726882431.12431: done dumping result, returning 8454 1726882431.12441: done running TaskExecutor() for managed_node3/TASK: ** TEST check polling interval [0affe814-3a2d-f59f-16b9-000000000071] 8454 1726882431.12447: sending task result for task 0affe814-3a2d-f59f-16b9-000000000071 8454 1726882431.12556: done sending task result for task 0affe814-3a2d-f59f-16b9-000000000071 8454 1726882431.12559: WORKER PROCESS EXITING ok: [managed_node3] => { "attempts": 1, "changed": false, "cmd": [ "grep", "Polling Interval", "/proc/net/bonding/nm-bond" ], "delta": "0:00:00.003567", "end": "2024-09-20 21:33:51.069998", "rc": 0, "start": "2024-09-20 21:33:51.066431" } STDOUT: MII Polling Interval (ms): 110 8454 1726882431.12651: no more pending results, returning what we have 8454 1726882431.12656: results queue empty 8454 1726882431.12657: checking for any_errors_fatal 8454 1726882431.12664: done checking for any_errors_fatal 
8454 1726882431.12665: checking for max_fail_percentage 8454 1726882431.12669: done checking for max_fail_percentage 8454 1726882431.12671: checking to see if all hosts have failed and the running result is not ok 8454 1726882431.12671: done checking to see if all hosts have failed 8454 1726882431.12672: getting the remaining hosts for this loop 8454 1726882431.12674: done getting the remaining hosts for this loop 8454 1726882431.12679: getting the next task for host managed_node3 8454 1726882431.12685: done getting next task for host managed_node3 8454 1726882431.12693: ^ task is: TASK: ** TEST check IPv4 8454 1726882431.12699: ^ state is: HOST STATE: block=2, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8454 1726882431.12703: getting variables 8454 1726882431.12705: in VariableManager get_vars() 8454 1726882431.12871: Calling all_inventory to load vars for managed_node3 8454 1726882431.12874: Calling groups_inventory to load vars for managed_node3 8454 1726882431.12877: Calling all_plugins_inventory to load vars for managed_node3 8454 1726882431.12889: Calling all_plugins_play to load vars for managed_node3 8454 1726882431.12893: Calling groups_plugins_inventory to load vars for managed_node3 8454 1726882431.12896: Calling groups_plugins_play to load vars for managed_node3 8454 1726882431.15349: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8454 1726882431.18263: done with get_vars() 8454 1726882431.18299: done getting variables 8454 1726882431.18372: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [** TEST check IPv4] ****************************************************** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond.yml:80 Friday 20 September 2024 21:33:51 -0400 (0:00:00.505) 0:00:29.201 ****** 8454 1726882431.18405: entering _queue_task() for managed_node3/command 8454 1726882431.18715: worker is 1 (out of 1 available) 8454 1726882431.18728: exiting _queue_task() for managed_node3/command 8454 1726882431.18944: done queuing things up, now waiting for results queue to drain 8454 1726882431.18946: waiting for pending results... 8454 1726882431.19156: running TaskExecutor() for managed_node3/TASK: ** TEST check IPv4 8454 1726882431.19162: in run() - task 0affe814-3a2d-f59f-16b9-000000000072 8454 1726882431.19183: variable 'ansible_search_path' from source: unknown 8454 1726882431.19238: calling self._execute() 8454 1726882431.19364: variable 'ansible_host' from source: host vars for 'managed_node3' 8454 1726882431.19380: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8454 1726882431.19500: variable 'omit' from source: magic vars 8454 1726882431.19933: variable 'ansible_distribution_major_version' from source: facts 8454 1726882431.19957: Evaluated conditional (ansible_distribution_major_version != '6'): True 8454 1726882431.19970: variable 'omit' from source: magic vars 8454 1726882431.20001: variable 'omit' from source: magic vars 8454 1726882431.20127: variable 'controller_device' from source: play vars 8454 1726882431.20167: variable 'omit' from source: magic vars 8454 1726882431.20222: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 8454 1726882431.20381: Loading Connection 'ssh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 8454 1726882431.20384: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 8454 1726882431.20387: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8454 1726882431.20389: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8454 1726882431.20395: variable 'inventory_hostname' from source: host vars for 'managed_node3' 8454 1726882431.20404: variable 'ansible_host' from source: host vars for 'managed_node3' 8454 1726882431.20414: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8454 1726882431.20562: Set connection var ansible_connection to ssh 8454 1726882431.20580: Set connection var ansible_shell_executable to /bin/sh 8454 1726882431.20606: Set connection var ansible_timeout to 10 8454 1726882431.20640: Set connection var ansible_shell_type to sh 8454 1726882431.20666: Set connection var ansible_pipelining to False 8454 1726882431.20678: Set connection var ansible_module_compression to ZIP_DEFLATED 8454 1726882431.20737: variable 'ansible_shell_executable' from source: unknown 8454 1726882431.20819: variable 'ansible_connection' from source: unknown 8454 1726882431.20823: variable 'ansible_module_compression' from source: unknown 8454 1726882431.20825: variable 'ansible_shell_type' from source: unknown 8454 1726882431.20929: variable 'ansible_shell_executable' from source: unknown 8454 1726882431.20932: variable 'ansible_host' from source: host vars for 'managed_node3' 8454 1726882431.20937: variable 'ansible_pipelining' from source: unknown 8454 1726882431.20939: variable 'ansible_timeout' from source: unknown 8454 1726882431.20942: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8454 1726882431.21041: 
Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 8454 1726882431.21061: variable 'omit' from source: magic vars 8454 1726882431.21071: starting attempt loop 8454 1726882431.21077: running the handler 8454 1726882431.21098: _low_level_execute_command(): starting 8454 1726882431.21110: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 8454 1726882431.21866: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 8454 1726882431.21880: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 8454 1726882431.21915: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8454 1726882431.21936: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 8454 1726882431.22030: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8454 1726882431.22053: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 8454 1726882431.22071: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 8454 
1726882431.22101: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8454 1726882431.22244: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8454 1726882431.24039: stdout chunk (state=3): >>>/root <<< 8454 1726882431.24239: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8454 1726882431.24243: stdout chunk (state=3): >>><<< 8454 1726882431.24245: stderr chunk (state=3): >>><<< 8454 1726882431.24265: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 8454 1726882431.24285: _low_level_execute_command(): starting 8454 1726882431.24374: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882431.2427242-9530-223724954628870 `" && echo 
ansible-tmp-1726882431.2427242-9530-223724954628870="` echo /root/.ansible/tmp/ansible-tmp-1726882431.2427242-9530-223724954628870 `" ) && sleep 0' 8454 1726882431.24971: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 8454 1726882431.25029: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 8454 1726882431.25039: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 8454 1726882431.25043: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 <<< 8454 1726882431.25064: stderr chunk (state=3): >>>debug2: match not found <<< 8454 1726882431.25074: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 8454 1726882431.25080: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 8454 1726882431.25097: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 8454 1726882431.25181: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8454 1726882431.25251: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 8454 1726882431.25348: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8454 1726882431.27428: stdout chunk (state=3): 
>>>ansible-tmp-1726882431.2427242-9530-223724954628870=/root/.ansible/tmp/ansible-tmp-1726882431.2427242-9530-223724954628870 <<< 8454 1726882431.27707: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8454 1726882431.27711: stderr chunk (state=3): >>><<< 8454 1726882431.27714: stdout chunk (state=3): >>><<< 8454 1726882431.27716: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882431.2427242-9530-223724954628870=/root/.ansible/tmp/ansible-tmp-1726882431.2427242-9530-223724954628870 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 8454 1726882431.27719: variable 'ansible_module_compression' from source: unknown 8454 1726882431.27854: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-8454ynh6eu7y/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 8454 1726882431.27857: variable 'ansible_facts' from source: unknown 8454 
1726882431.27860: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882431.2427242-9530-223724954628870/AnsiballZ_command.py 8454 1726882431.28164: Sending initial data 8454 1726882431.28170: Sent initial data (154 bytes) 8454 1726882431.28555: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 8454 1726882431.28562: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 8454 1726882431.28569: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 <<< 8454 1726882431.28576: stderr chunk (state=3): >>>debug2: match not found <<< 8454 1726882431.28582: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8454 1726882431.28613: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 8454 1726882431.28616: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found <<< 8454 1726882431.28619: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8454 1726882431.28674: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 8454 1726882431.28679: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8454 1726882431.28795: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8454 1726882431.30489: stderr chunk (state=3): >>>debug2: Remote version: 
3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 8454 1726882431.30592: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 8454 1726882431.30730: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-8454ynh6eu7y/tmplrctpwge /root/.ansible/tmp/ansible-tmp-1726882431.2427242-9530-223724954628870/AnsiballZ_command.py <<< 8454 1726882431.30733: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882431.2427242-9530-223724954628870/AnsiballZ_command.py" <<< 8454 1726882431.30837: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-8454ynh6eu7y/tmplrctpwge" to remote "/root/.ansible/tmp/ansible-tmp-1726882431.2427242-9530-223724954628870/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882431.2427242-9530-223724954628870/AnsiballZ_command.py" <<< 8454 1726882431.31941: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8454 1726882431.31945: stderr chunk (state=3): >>><<< 8454 1726882431.31948: stdout chunk (state=3): >>><<< 8454 1726882431.31963: done transferring module to remote 8454 1726882431.31973: _low_level_execute_command(): starting 8454 1726882431.31980: 
_low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882431.2427242-9530-223724954628870/ /root/.ansible/tmp/ansible-tmp-1726882431.2427242-9530-223724954628870/AnsiballZ_command.py && sleep 0' 8454 1726882431.32407: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 8454 1726882431.32411: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8454 1726882431.32413: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 8454 1726882431.32418: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found <<< 8454 1726882431.32421: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8454 1726882431.32466: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 8454 1726882431.32473: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8454 1726882431.32585: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8454 1726882431.34497: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8454 1726882431.34547: stderr chunk (state=3): >>><<< 8454 1726882431.34550: stdout chunk (state=3): >>><<< 8454 1726882431.34560: 
_low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 8454 1726882431.34616: _low_level_execute_command(): starting 8454 1726882431.34624: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882431.2427242-9530-223724954628870/AnsiballZ_command.py && sleep 0' 8454 1726882431.34994: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 8454 1726882431.34998: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found <<< 8454 1726882431.35000: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration <<< 8454 1726882431.35003: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 8454 1726882431.35005: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8454 1726882431.35059: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 8454 1726882431.35063: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8454 1726882431.35188: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8454 1726882431.52916: stdout chunk (state=3): >>> {"changed": true, "stdout": "18: nm-bond: mtu 1500 qdisc noqueue state UP group default qlen 1000\n inet 192.0.2.231/24 brd 192.0.2.255 scope global dynamic noprefixroute nm-bond\n valid_lft 233sec preferred_lft 233sec", "stderr": "", "rc": 0, "cmd": ["ip", "-4", "a", "s", "nm-bond"], "start": "2024-09-20 21:33:51.523153", "end": "2024-09-20 21:33:51.526882", "delta": "0:00:00.003729", "msg": "", "invocation": {"module_args": {"_raw_params": "ip -4 a s nm-bond", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 8454 1726882431.54767: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.41.238 closed. 
<<< 8454 1726882431.54772: stdout chunk (state=3): >>><<< 8454 1726882431.54775: stderr chunk (state=3): >>><<< 8454 1726882431.54950: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "18: nm-bond: mtu 1500 qdisc noqueue state UP group default qlen 1000\n inet 192.0.2.231/24 brd 192.0.2.255 scope global dynamic noprefixroute nm-bond\n valid_lft 233sec preferred_lft 233sec", "stderr": "", "rc": 0, "cmd": ["ip", "-4", "a", "s", "nm-bond"], "start": "2024-09-20 21:33:51.523153", "end": "2024-09-20 21:33:51.526882", "delta": "0:00:00.003729", "msg": "", "invocation": {"module_args": {"_raw_params": "ip -4 a s nm-bond", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from 
master 0 Shared connection to 10.31.41.238 closed. 8454 1726882431.54955: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip -4 a s nm-bond', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882431.2427242-9530-223724954628870/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 8454 1726882431.54958: _low_level_execute_command(): starting 8454 1726882431.54961: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882431.2427242-9530-223724954628870/ > /dev/null 2>&1 && sleep 0' 8454 1726882431.55755: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 8454 1726882431.55863: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 
10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8454 1726882431.55944: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 8454 1726882431.56097: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8454 1726882431.58154: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8454 1726882431.58200: stderr chunk (state=3): >>><<< 8454 1726882431.58204: stdout chunk (state=3): >>><<< 8454 1726882431.58216: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 8454 1726882431.58223: handler run complete 8454 1726882431.58251: Evaluated conditional (False): False 8454 1726882431.58383: variable 'result' from source: set_fact 
8454 1726882431.58397: Evaluated conditional ('192.0.2' in result.stdout): True 8454 1726882431.58409: attempt loop complete, returning result 8454 1726882431.58412: _execute() done 8454 1726882431.58417: dumping result to json 8454 1726882431.58423: done dumping result, returning 8454 1726882431.58432: done running TaskExecutor() for managed_node3/TASK: ** TEST check IPv4 [0affe814-3a2d-f59f-16b9-000000000072] 8454 1726882431.58440: sending task result for task 0affe814-3a2d-f59f-16b9-000000000072 8454 1726882431.58554: done sending task result for task 0affe814-3a2d-f59f-16b9-000000000072 8454 1726882431.58557: WORKER PROCESS EXITING ok: [managed_node3] => { "attempts": 1, "changed": false, "cmd": [ "ip", "-4", "a", "s", "nm-bond" ], "delta": "0:00:00.003729", "end": "2024-09-20 21:33:51.526882", "rc": 0, "start": "2024-09-20 21:33:51.523153" } STDOUT: 18: nm-bond: mtu 1500 qdisc noqueue state UP group default qlen 1000 inet 192.0.2.231/24 brd 192.0.2.255 scope global dynamic noprefixroute nm-bond valid_lft 233sec preferred_lft 233sec 8454 1726882431.58654: no more pending results, returning what we have 8454 1726882431.58658: results queue empty 8454 1726882431.58659: checking for any_errors_fatal 8454 1726882431.58667: done checking for any_errors_fatal 8454 1726882431.58668: checking for max_fail_percentage 8454 1726882431.58671: done checking for max_fail_percentage 8454 1726882431.58672: checking to see if all hosts have failed and the running result is not ok 8454 1726882431.58673: done checking to see if all hosts have failed 8454 1726882431.58674: getting the remaining hosts for this loop 8454 1726882431.58676: done getting the remaining hosts for this loop 8454 1726882431.58682: getting the next task for host managed_node3 8454 1726882431.58688: done getting next task for host managed_node3 8454 1726882431.58691: ^ task is: TASK: ** TEST check IPv6 8454 1726882431.58694: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=0, handlers=0, 
run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8454 1726882431.58698: getting variables 8454 1726882431.58699: in VariableManager get_vars() 8454 1726882431.58753: Calling all_inventory to load vars for managed_node3 8454 1726882431.58757: Calling groups_inventory to load vars for managed_node3 8454 1726882431.58760: Calling all_plugins_inventory to load vars for managed_node3 8454 1726882431.58772: Calling all_plugins_play to load vars for managed_node3 8454 1726882431.58775: Calling groups_plugins_inventory to load vars for managed_node3 8454 1726882431.58781: Calling groups_plugins_play to load vars for managed_node3 8454 1726882431.60791: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8454 1726882431.64737: done with get_vars() 8454 1726882431.64774: done getting variables 8454 1726882431.64846: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [** TEST check IPv6] ****************************************************** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond.yml:87 Friday 20 September 2024 21:33:51 -0400 (0:00:00.464) 0:00:29.666 ****** 8454 1726882431.64881: entering _queue_task() for managed_node3/command 8454 1726882431.65227: worker is 1 (out of 1 available) 8454 1726882431.65247: exiting _queue_task() for managed_node3/command 8454 1726882431.65457: done queuing things up, now waiting for results queue to drain 8454 1726882431.65459: waiting for pending results... 
8454 1726882431.66155: running TaskExecutor() for managed_node3/TASK: ** TEST check IPv6 8454 1726882431.66228: in run() - task 0affe814-3a2d-f59f-16b9-000000000073 8454 1726882431.66256: variable 'ansible_search_path' from source: unknown 8454 1726882431.66539: calling self._execute() 8454 1726882431.66598: variable 'ansible_host' from source: host vars for 'managed_node3' 8454 1726882431.66612: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8454 1726882431.66653: variable 'omit' from source: magic vars 8454 1726882431.67539: variable 'ansible_distribution_major_version' from source: facts 8454 1726882431.67562: Evaluated conditional (ansible_distribution_major_version != '6'): True 8454 1726882431.67574: variable 'omit' from source: magic vars 8454 1726882431.67941: variable 'omit' from source: magic vars 8454 1726882431.67945: variable 'controller_device' from source: play vars 8454 1726882431.67948: variable 'omit' from source: magic vars 8454 1726882431.68079: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 8454 1726882431.68126: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 8454 1726882431.68160: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 8454 1726882431.68265: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8454 1726882431.68308: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8454 1726882431.68376: variable 'inventory_hostname' from source: host vars for 'managed_node3' 8454 1726882431.68654: variable 'ansible_host' from source: host vars for 'managed_node3' 8454 1726882431.68658: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8454 1726882431.68754: Set connection var 
ansible_connection to ssh 8454 1726882431.68792: Set connection var ansible_shell_executable to /bin/sh 8454 1726882431.68852: Set connection var ansible_timeout to 10 8454 1726882431.68862: Set connection var ansible_shell_type to sh 8454 1726882431.68882: Set connection var ansible_pipelining to False 8454 1726882431.68902: Set connection var ansible_module_compression to ZIP_DEFLATED 8454 1726882431.69110: variable 'ansible_shell_executable' from source: unknown 8454 1726882431.69114: variable 'ansible_connection' from source: unknown 8454 1726882431.69117: variable 'ansible_module_compression' from source: unknown 8454 1726882431.69120: variable 'ansible_shell_type' from source: unknown 8454 1726882431.69122: variable 'ansible_shell_executable' from source: unknown 8454 1726882431.69125: variable 'ansible_host' from source: host vars for 'managed_node3' 8454 1726882431.69127: variable 'ansible_pipelining' from source: unknown 8454 1726882431.69129: variable 'ansible_timeout' from source: unknown 8454 1726882431.69131: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8454 1726882431.69462: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 8454 1726882431.69482: variable 'omit' from source: magic vars 8454 1726882431.69495: starting attempt loop 8454 1726882431.69504: running the handler 8454 1726882431.69527: _low_level_execute_command(): starting 8454 1726882431.69799: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 8454 1726882431.70902: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8454 1726882431.71010: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8454 1726882431.71185: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 8454 1726882431.71199: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 8454 1726882431.71341: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8454 1726882431.73181: stdout chunk (state=3): >>>/root <<< 8454 1726882431.73282: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8454 1726882431.73346: stderr chunk (state=3): >>><<< 8454 1726882431.73357: stdout chunk (state=3): >>><<< 8454 1726882431.73417: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address 
debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 8454 1726882431.73443: _low_level_execute_command(): starting 8454 1726882431.73590: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882431.7342558-9558-247865274242824 `" && echo ansible-tmp-1726882431.7342558-9558-247865274242824="` echo /root/.ansible/tmp/ansible-tmp-1726882431.7342558-9558-247865274242824 `" ) && sleep 0' 8454 1726882431.74672: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 8454 1726882431.74675: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8454 1726882431.74678: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 8454 1726882431.74687: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf 
debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8454 1726882431.74740: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 8454 1726882431.74865: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 8454 1726882431.74996: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8454 1726882431.77264: stdout chunk (state=3): >>>ansible-tmp-1726882431.7342558-9558-247865274242824=/root/.ansible/tmp/ansible-tmp-1726882431.7342558-9558-247865274242824 <<< 8454 1726882431.77268: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8454 1726882431.77339: stderr chunk (state=3): >>><<< 8454 1726882431.77342: stdout chunk (state=3): >>><<< 8454 1726882431.77541: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882431.7342558-9558-247865274242824=/root/.ansible/tmp/ansible-tmp-1726882431.7342558-9558-247865274242824 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 8454 1726882431.77545: variable 'ansible_module_compression' from source: unknown 8454 1726882431.77548: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-8454ynh6eu7y/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 8454 1726882431.77551: variable 'ansible_facts' from source: unknown 8454 1726882431.77890: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882431.7342558-9558-247865274242824/AnsiballZ_command.py 8454 1726882431.78632: Sending initial data 8454 1726882431.78641: Sent initial data (154 bytes) 8454 1726882431.79852: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8454 1726882431.80147: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: 
master session id: 2 <<< 8454 1726882431.82159: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 8454 1726882431.82173: stderr chunk (state=3): >>>debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 8454 1726882431.82274: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 8454 1726882431.82390: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-8454ynh6eu7y/tmp19ib5hbd /root/.ansible/tmp/ansible-tmp-1726882431.7342558-9558-247865274242824/AnsiballZ_command.py <<< 8454 1726882431.82407: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882431.7342558-9558-247865274242824/AnsiballZ_command.py" <<< 8454 1726882431.82507: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-8454ynh6eu7y/tmp19ib5hbd" to remote "/root/.ansible/tmp/ansible-tmp-1726882431.7342558-9558-247865274242824/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882431.7342558-9558-247865274242824/AnsiballZ_command.py" <<< 8454 1726882431.85678: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8454 1726882431.85682: stdout chunk (state=3): >>><<< 8454 1726882431.85685: stderr chunk (state=3): 
>>><<< 8454 1726882431.85688: done transferring module to remote 8454 1726882431.85690: _low_level_execute_command(): starting 8454 1726882431.85693: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882431.7342558-9558-247865274242824/ /root/.ansible/tmp/ansible-tmp-1726882431.7342558-9558-247865274242824/AnsiballZ_command.py && sleep 0' 8454 1726882431.86755: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 8454 1726882431.86769: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 8454 1726882431.86781: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8454 1726882431.86846: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 8454 1726882431.87162: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 8454 1726882431.87285: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8454 1726882431.89380: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8454 1726882431.89398: stdout chunk (state=3): >>><<< 8454 1726882431.89405: stderr chunk 
(state=3): >>><<< 8454 1726882431.89412: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 8454 1726882431.89415: _low_level_execute_command(): starting 8454 1726882431.89421: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882431.7342558-9558-247865274242824/AnsiballZ_command.py && sleep 0' 8454 1726882431.90568: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 8454 1726882431.90848: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master <<< 8454 1726882431.90866: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 8454 1726882431.91018: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8454 1726882432.08810: stdout chunk (state=3): >>> {"changed": true, "stdout": "18: nm-bond: mtu 1500 qdisc noqueue state UP group default qlen 1000\n inet6 2001:db8::a4/128 scope global dynamic noprefixroute \n valid_lft 233sec preferred_lft 233sec\n inet6 2001:db8::e418:d587:b013:e3c3/64 scope global dynamic noprefixroute \n valid_lft 1799sec preferred_lft 1799sec\n inet6 fe80::3bf5:8b54:4141:5e72/64 scope link noprefixroute \n valid_lft forever preferred_lft forever", "stderr": "", "rc": 0, "cmd": ["ip", "-6", "a", "s", "nm-bond"], "start": "2024-09-20 21:33:52.081340", "end": "2024-09-20 21:33:52.085109", "delta": "0:00:00.003769", "msg": "", "invocation": {"module_args": {"_raw_params": "ip -6 a s nm-bond", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 8454 1726882432.10343: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8454 1726882432.10350: stderr chunk (state=3): >>>Shared connection to 10.31.41.238 closed. 
<<< 8454 1726882432.10457: stderr chunk (state=3): >>><<< 8454 1726882432.10467: stdout chunk (state=3): >>><<< 8454 1726882432.10491: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "18: nm-bond: mtu 1500 qdisc noqueue state UP group default qlen 1000\n inet6 2001:db8::a4/128 scope global dynamic noprefixroute \n valid_lft 233sec preferred_lft 233sec\n inet6 2001:db8::e418:d587:b013:e3c3/64 scope global dynamic noprefixroute \n valid_lft 1799sec preferred_lft 1799sec\n inet6 fe80::3bf5:8b54:4141:5e72/64 scope link noprefixroute \n valid_lft forever preferred_lft forever", "stderr": "", "rc": 0, "cmd": ["ip", "-6", "a", "s", "nm-bond"], "start": "2024-09-20 21:33:52.081340", "end": "2024-09-20 21:33:52.085109", "delta": "0:00:00.003769", "msg": "", "invocation": {"module_args": {"_raw_params": "ip -6 a s nm-bond", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: 
Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.41.238 closed. 8454 1726882432.10546: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip -6 a s nm-bond', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882431.7342558-9558-247865274242824/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 8454 1726882432.10557: _low_level_execute_command(): starting 8454 1726882432.10563: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882431.7342558-9558-247865274242824/ > /dev/null 2>&1 && sleep 0' 8454 1726882432.11962: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 8454 1726882432.12018: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 8454 1726882432.12038: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 8454 1726882432.12061: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 8454 1726882432.12221: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests 
final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 8454 1726882432.12356: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8454 1726882432.12504: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8454 1726882432.14741: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8454 1726882432.14746: stdout chunk (state=3): >>><<< 8454 1726882432.14748: stderr chunk (state=3): >>><<< 8454 1726882432.14944: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying 
existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 8454 1726882432.14948: handler run complete 8454 1726882432.14950: Evaluated conditional (False): False 8454 1726882432.15218: variable 'result' from source: set_fact 8454 1726882432.15295: Evaluated conditional ('2001' in result.stdout): True 8454 1726882432.15400: attempt loop complete, returning result 8454 1726882432.15409: _execute() done 8454 1726882432.15417: dumping result to json 8454 1726882432.15428: done dumping result, returning 8454 1726882432.15445: done running TaskExecutor() for managed_node3/TASK: ** TEST check IPv6 [0affe814-3a2d-f59f-16b9-000000000073] 8454 1726882432.15456: sending task result for task 0affe814-3a2d-f59f-16b9-000000000073 ok: [managed_node3] => { "attempts": 1, "changed": false, "cmd": [ "ip", "-6", "a", "s", "nm-bond" ], "delta": "0:00:00.003769", "end": "2024-09-20 21:33:52.085109", "rc": 0, "start": "2024-09-20 21:33:52.081340" } STDOUT: 18: nm-bond: mtu 1500 qdisc noqueue state UP group default qlen 1000 inet6 2001:db8::a4/128 scope global dynamic noprefixroute valid_lft 233sec preferred_lft 233sec inet6 2001:db8::e418:d587:b013:e3c3/64 scope global dynamic noprefixroute valid_lft 1799sec preferred_lft 1799sec inet6 fe80::3bf5:8b54:4141:5e72/64 scope link noprefixroute valid_lft forever preferred_lft forever 8454 1726882432.15778: no more pending results, returning what we have 8454 1726882432.15783: results queue empty 8454 1726882432.15784: checking for any_errors_fatal 8454 1726882432.15792: done checking for any_errors_fatal 8454 1726882432.15794: checking for max_fail_percentage 8454 1726882432.15796: done checking for max_fail_percentage 8454 1726882432.15797: checking to see if all hosts have failed and the running result is not ok 8454 1726882432.15798: done checking to see if all hosts have failed 8454 
1726882432.15799: getting the remaining hosts for this loop 8454 1726882432.15801: done getting the remaining hosts for this loop 8454 1726882432.15806: getting the next task for host managed_node3 8454 1726882432.15819: done getting next task for host managed_node3 8454 1726882432.15825: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 8454 1726882432.15830: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 8454 1726882432.16258: getting variables 8454 1726882432.16260: in VariableManager get_vars() 8454 1726882432.16313: Calling all_inventory to load vars for managed_node3 8454 1726882432.16317: Calling groups_inventory to load vars for managed_node3 8454 1726882432.16320: Calling all_plugins_inventory to load vars for managed_node3 8454 1726882432.16333: Calling all_plugins_play to load vars for managed_node3 8454 1726882432.16339: Calling groups_plugins_inventory to load vars for managed_node3 8454 1726882432.16345: Calling groups_plugins_play to load vars for managed_node3 8454 1726882432.16951: done sending task result for task 0affe814-3a2d-f59f-16b9-000000000073 8454 1726882432.16955: WORKER PROCESS EXITING 8454 1726882432.27661: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8454 1726882432.33345: done with get_vars() 8454 1726882432.33394: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] *** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4 Friday 20 September 2024 21:33:52 -0400 (0:00:00.688) 0:00:30.354 ****** 8454 1726882432.33709: entering _queue_task() for managed_node3/include_tasks 8454 1726882432.34473: worker is 1 (out of 1 available) 8454 1726882432.34484: exiting _queue_task() for managed_node3/include_tasks 8454 1726882432.34497: done queuing things up, now waiting for results queue to drain 8454 1726882432.34499: waiting for pending results... 
8454 1726882432.34685: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 8454 1726882432.34909: in run() - task 0affe814-3a2d-f59f-16b9-00000000007c 8454 1726882432.34939: variable 'ansible_search_path' from source: unknown 8454 1726882432.34953: variable 'ansible_search_path' from source: unknown 8454 1726882432.35005: calling self._execute() 8454 1726882432.35127: variable 'ansible_host' from source: host vars for 'managed_node3' 8454 1726882432.35148: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8454 1726882432.35172: variable 'omit' from source: magic vars 8454 1726882432.35649: variable 'ansible_distribution_major_version' from source: facts 8454 1726882432.35670: Evaluated conditional (ansible_distribution_major_version != '6'): True 8454 1726882432.35690: _execute() done 8454 1726882432.35705: dumping result to json 8454 1726882432.35739: done dumping result, returning 8454 1726882432.35743: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [0affe814-3a2d-f59f-16b9-00000000007c] 8454 1726882432.35746: sending task result for task 0affe814-3a2d-f59f-16b9-00000000007c 8454 1726882432.36082: no more pending results, returning what we have 8454 1726882432.36089: in VariableManager get_vars() 8454 1726882432.36145: Calling all_inventory to load vars for managed_node3 8454 1726882432.36149: Calling groups_inventory to load vars for managed_node3 8454 1726882432.36152: Calling all_plugins_inventory to load vars for managed_node3 8454 1726882432.36166: Calling all_plugins_play to load vars for managed_node3 8454 1726882432.36170: Calling groups_plugins_inventory to load vars for managed_node3 8454 1726882432.36174: Calling groups_plugins_play to load vars for managed_node3 8454 1726882432.36751: done sending task result for task 0affe814-3a2d-f59f-16b9-00000000007c 8454 1726882432.36754: WORKER 
PROCESS EXITING 8454 1726882432.38506: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8454 1726882432.43576: done with get_vars() 8454 1726882432.43611: variable 'ansible_search_path' from source: unknown 8454 1726882432.43613: variable 'ansible_search_path' from source: unknown 8454 1726882432.43668: we have included files to process 8454 1726882432.43670: generating all_blocks data 8454 1726882432.43674: done generating all_blocks data 8454 1726882432.43680: processing included file: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 8454 1726882432.43682: loading included file: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 8454 1726882432.43685: Loading data from /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 8454 1726882432.45248: done processing included file 8454 1726882432.45251: iterating over new_blocks loaded from include file 8454 1726882432.45253: in VariableManager get_vars() 8454 1726882432.45289: done with get_vars() 8454 1726882432.45291: filtering new block on tags 8454 1726882432.45331: done filtering new block on tags 8454 1726882432.45538: in VariableManager get_vars() 8454 1726882432.45572: done with get_vars() 8454 1726882432.45574: filtering new block on tags 8454 1726882432.45632: done filtering new block on tags 8454 1726882432.45638: in VariableManager get_vars() 8454 1726882432.45669: done with get_vars() 8454 1726882432.45671: filtering new block on tags 8454 1726882432.45727: done filtering new block on tags 8454 1726882432.45730: done iterating over new_blocks loaded from include file included: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml for managed_node3 8454 1726882432.45940: extending task lists for all hosts with included blocks 8454 1726882432.48866: done 
extending task lists 8454 1726882432.48867: done processing included files 8454 1726882432.48868: results queue empty 8454 1726882432.48869: checking for any_errors_fatal 8454 1726882432.48875: done checking for any_errors_fatal 8454 1726882432.48876: checking for max_fail_percentage 8454 1726882432.48878: done checking for max_fail_percentage 8454 1726882432.48879: checking to see if all hosts have failed and the running result is not ok 8454 1726882432.48880: done checking to see if all hosts have failed 8454 1726882432.48881: getting the remaining hosts for this loop 8454 1726882432.48882: done getting the remaining hosts for this loop 8454 1726882432.48886: getting the next task for host managed_node3 8454 1726882432.48893: done getting next task for host managed_node3 8454 1726882432.48896: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 8454 1726882432.48900: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False), did rescue? False, did start at task? False 8454 1726882432.48912: getting variables 8454 1726882432.48914: in VariableManager get_vars() 8454 1726882432.48933: Calling all_inventory to load vars for managed_node3 8454 1726882432.49139: Calling groups_inventory to load vars for managed_node3 8454 1726882432.49143: Calling all_plugins_inventory to load vars for managed_node3 8454 1726882432.49151: Calling all_plugins_play to load vars for managed_node3 8454 1726882432.49154: Calling groups_plugins_inventory to load vars for managed_node3 8454 1726882432.49158: Calling groups_plugins_play to load vars for managed_node3 8454 1726882432.52333: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8454 1726882432.55227: done with get_vars() 8454 1726882432.55264: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role are present] *** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:3 Friday 20 September 2024 21:33:52 -0400 (0:00:00.216) 0:00:30.571 ****** 8454 1726882432.55367: entering _queue_task() for managed_node3/setup 8454 1726882432.55879: worker is 1 (out of 1 available) 8454 1726882432.55892: exiting _queue_task() for managed_node3/setup 8454 1726882432.55905: done queuing things up, now waiting for results queue to drain 8454 1726882432.55907: waiting for pending results... 
8454 1726882432.56129: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 8454 1726882432.56367: in run() - task 0affe814-3a2d-f59f-16b9-000000000491 8454 1726882432.56388: variable 'ansible_search_path' from source: unknown 8454 1726882432.56395: variable 'ansible_search_path' from source: unknown 8454 1726882432.56446: calling self._execute() 8454 1726882432.56841: variable 'ansible_host' from source: host vars for 'managed_node3' 8454 1726882432.56846: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8454 1726882432.56850: variable 'omit' from source: magic vars 8454 1726882432.57697: variable 'ansible_distribution_major_version' from source: facts 8454 1726882432.57771: Evaluated conditional (ansible_distribution_major_version != '6'): True 8454 1726882432.58292: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 8454 1726882432.63540: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 8454 1726882432.63586: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 8454 1726882432.63791: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 8454 1726882432.63963: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 8454 1726882432.63998: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 8454 1726882432.64096: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 8454 1726882432.64131: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 8454 1726882432.64359: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 8454 1726882432.64418: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 8454 1726882432.64438: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 8454 1726882432.64548: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 8454 1726882432.64576: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 8454 1726882432.64726: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 8454 1726882432.64777: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 8454 1726882432.64798: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 8454 1726882432.65394: variable '__network_required_facts' from source: role '' defaults 
8454 1726882432.65405: variable 'ansible_facts' from source: unknown 8454 1726882432.68272: Evaluated conditional (__network_required_facts | difference(ansible_facts.keys() | list) | length > 0): False 8454 1726882432.68278: when evaluation is False, skipping this task 8454 1726882432.68281: _execute() done 8454 1726882432.68287: dumping result to json 8454 1726882432.68289: done dumping result, returning 8454 1726882432.68302: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present [0affe814-3a2d-f59f-16b9-000000000491] 8454 1726882432.68307: sending task result for task 0affe814-3a2d-f59f-16b9-000000000491 8454 1726882432.68458: done sending task result for task 0affe814-3a2d-f59f-16b9-000000000491 8454 1726882432.68462: WORKER PROCESS EXITING skipping: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 8454 1726882432.68542: no more pending results, returning what we have 8454 1726882432.68546: results queue empty 8454 1726882432.68547: checking for any_errors_fatal 8454 1726882432.68549: done checking for any_errors_fatal 8454 1726882432.68550: checking for max_fail_percentage 8454 1726882432.68552: done checking for max_fail_percentage 8454 1726882432.68553: checking to see if all hosts have failed and the running result is not ok 8454 1726882432.68554: done checking to see if all hosts have failed 8454 1726882432.68555: getting the remaining hosts for this loop 8454 1726882432.68557: done getting the remaining hosts for this loop 8454 1726882432.68561: getting the next task for host managed_node3 8454 1726882432.68575: done getting next task for host managed_node3 8454 1726882432.68580: ^ task is: TASK: fedora.linux_system_roles.network : Check if system is ostree 8454 1726882432.68587: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, 
pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 8454 1726882432.68613: getting variables 8454 1726882432.68615: in VariableManager get_vars() 8454 1726882432.68918: Calling all_inventory to load vars for managed_node3 8454 1726882432.68922: Calling groups_inventory to load vars for managed_node3 8454 1726882432.68925: Calling all_plugins_inventory to load vars for managed_node3 8454 1726882432.68938: Calling all_plugins_play to load vars for managed_node3 8454 1726882432.68942: Calling groups_plugins_inventory to load vars for managed_node3 8454 1726882432.68947: Calling groups_plugins_play to load vars for managed_node3 8454 1726882432.74184: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8454 1726882432.80389: done with get_vars() 8454 1726882432.80428: done getting variables TASK [fedora.linux_system_roles.network : Check if system is ostree] *********** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:12 Friday 20 September 2024 21:33:52 -0400 (0:00:00.254) 0:00:30.825 ****** 8454 1726882432.80823: entering _queue_task() for managed_node3/stat 8454 1726882432.81541: worker is 1 (out of 1 available) 8454 1726882432.81669: exiting _queue_task() for managed_node3/stat 8454 1726882432.81686: done queuing things up, now waiting for results queue to drain 8454 1726882432.81688: waiting for pending results... 
8454 1726882432.82062: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if system is ostree 8454 1726882432.82438: in run() - task 0affe814-3a2d-f59f-16b9-000000000493 8454 1726882432.82484: variable 'ansible_search_path' from source: unknown 8454 1726882432.82489: variable 'ansible_search_path' from source: unknown 8454 1726882432.82493: calling self._execute() 8454 1726882432.82818: variable 'ansible_host' from source: host vars for 'managed_node3' 8454 1726882432.82826: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8454 1726882432.82869: variable 'omit' from source: magic vars 8454 1726882432.84086: variable 'ansible_distribution_major_version' from source: facts 8454 1726882432.84090: Evaluated conditional (ansible_distribution_major_version != '6'): True 8454 1726882432.84427: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 8454 1726882432.85082: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 8454 1726882432.85136: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 8454 1726882432.85286: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 8454 1726882432.85344: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 8454 1726882432.85662: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 8454 1726882432.85693: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 8454 1726882432.85842: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' 
from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 8454 1726882432.85874: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 8454 1726882432.86095: variable '__network_is_ostree' from source: set_fact 8454 1726882432.86103: Evaluated conditional (not __network_is_ostree is defined): False 8454 1726882432.86106: when evaluation is False, skipping this task 8454 1726882432.86109: _execute() done 8454 1726882432.86114: dumping result to json 8454 1726882432.86119: done dumping result, returning 8454 1726882432.86129: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if system is ostree [0affe814-3a2d-f59f-16b9-000000000493] 8454 1726882432.86137: sending task result for task 0affe814-3a2d-f59f-16b9-000000000493 8454 1726882432.86359: done sending task result for task 0affe814-3a2d-f59f-16b9-000000000493 8454 1726882432.86365: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 8454 1726882432.86425: no more pending results, returning what we have 8454 1726882432.86430: results queue empty 8454 1726882432.86431: checking for any_errors_fatal 8454 1726882432.86442: done checking for any_errors_fatal 8454 1726882432.86443: checking for max_fail_percentage 8454 1726882432.86445: done checking for max_fail_percentage 8454 1726882432.86446: checking to see if all hosts have failed and the running result is not ok 8454 1726882432.86447: done checking to see if all hosts have failed 8454 1726882432.86448: getting the remaining hosts for this loop 8454 1726882432.86450: done getting the remaining hosts for this loop 8454 1726882432.86454: getting the next task for host managed_node3 8454 1726882432.86462: done 
getting next task for host managed_node3 8454 1726882432.86466: ^ task is: TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 8454 1726882432.86472: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 8454 1726882432.86494: getting variables 8454 1726882432.86496: in VariableManager get_vars() 8454 1726882432.86741: Calling all_inventory to load vars for managed_node3 8454 1726882432.86745: Calling groups_inventory to load vars for managed_node3 8454 1726882432.86749: Calling all_plugins_inventory to load vars for managed_node3 8454 1726882432.86759: Calling all_plugins_play to load vars for managed_node3 8454 1726882432.86763: Calling groups_plugins_inventory to load vars for managed_node3 8454 1726882432.86767: Calling groups_plugins_play to load vars for managed_node3 8454 1726882432.92256: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8454 1726882432.95291: done with get_vars() 8454 1726882432.95328: done getting variables 8454 1726882432.95402: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Set flag to indicate system is ostree] *** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:17 Friday 20 September 2024 21:33:52 -0400 (0:00:00.146) 0:00:30.972 ****** 8454 1726882432.95455: entering _queue_task() for managed_node3/set_fact 8454 1726882432.95871: worker is 1 (out of 1 available) 8454 1726882432.95888: exiting _queue_task() for managed_node3/set_fact 8454 1726882432.95904: done queuing things up, now waiting for results queue to drain 8454 1726882432.95906: waiting for pending results... 
8454 1726882432.96554: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 8454 1726882432.96560: in run() - task 0affe814-3a2d-f59f-16b9-000000000494 8454 1726882432.96564: variable 'ansible_search_path' from source: unknown 8454 1726882432.96566: variable 'ansible_search_path' from source: unknown 8454 1726882432.96569: calling self._execute() 8454 1726882432.96571: variable 'ansible_host' from source: host vars for 'managed_node3' 8454 1726882432.96574: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8454 1726882432.96580: variable 'omit' from source: magic vars 8454 1726882432.96995: variable 'ansible_distribution_major_version' from source: facts 8454 1726882432.97008: Evaluated conditional (ansible_distribution_major_version != '6'): True 8454 1726882432.97224: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 8454 1726882432.97740: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 8454 1726882432.97745: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 8454 1726882432.97749: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 8454 1726882432.97753: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 8454 1726882432.98339: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 8454 1726882432.98343: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 8454 1726882432.98346: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 8454 1726882432.98349: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 8454 1726882432.98369: variable '__network_is_ostree' from source: set_fact 8454 1726882432.98380: Evaluated conditional (not __network_is_ostree is defined): False 8454 1726882432.98383: when evaluation is False, skipping this task 8454 1726882432.98386: _execute() done 8454 1726882432.98389: dumping result to json 8454 1726882432.98399: done dumping result, returning 8454 1726882432.98412: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree [0affe814-3a2d-f59f-16b9-000000000494] 8454 1726882432.98420: sending task result for task 0affe814-3a2d-f59f-16b9-000000000494 skipping: [managed_node3] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 8454 1726882432.98575: no more pending results, returning what we have 8454 1726882432.98579: results queue empty 8454 1726882432.98581: checking for any_errors_fatal 8454 1726882432.98587: done checking for any_errors_fatal 8454 1726882432.98588: checking for max_fail_percentage 8454 1726882432.98591: done checking for max_fail_percentage 8454 1726882432.98592: checking to see if all hosts have failed and the running result is not ok 8454 1726882432.98593: done checking to see if all hosts have failed 8454 1726882432.98594: getting the remaining hosts for this loop 8454 1726882432.98596: done getting the remaining hosts for this loop 8454 1726882432.98601: getting the next task for host managed_node3 8454 1726882432.98615: done getting next task for host managed_node3 8454 1726882432.98620: ^ 
task is: TASK: fedora.linux_system_roles.network : Check which services are running 8454 1726882432.98626: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 8454 1726882432.98650: getting variables 8454 1726882432.98652: in VariableManager get_vars() 8454 1726882432.98698: Calling all_inventory to load vars for managed_node3 8454 1726882432.98701: Calling groups_inventory to load vars for managed_node3 8454 1726882432.98704: Calling all_plugins_inventory to load vars for managed_node3 8454 1726882432.98716: Calling all_plugins_play to load vars for managed_node3 8454 1726882432.98720: Calling groups_plugins_inventory to load vars for managed_node3 8454 1726882432.98724: Calling groups_plugins_play to load vars for managed_node3 8454 1726882432.99349: done sending task result for task 0affe814-3a2d-f59f-16b9-000000000494 8454 1726882432.99353: WORKER PROCESS EXITING 8454 1726882433.03169: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8454 1726882433.09025: done with get_vars() 8454 1726882433.09069: done getting variables TASK [fedora.linux_system_roles.network : Check which services are running] **** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 Friday 20 September 2024 21:33:53 -0400 (0:00:00.139) 0:00:31.111 ****** 8454 1726882433.09402: entering _queue_task() for managed_node3/service_facts 8454 1726882433.10078: worker is 1 (out of 1 available) 8454 1726882433.10093: exiting _queue_task() for managed_node3/service_facts 8454 1726882433.10109: done queuing things up, now waiting for results queue to drain 8454 1726882433.10110: waiting for pending results... 
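The next task queued is `service_facts`, which gathers the state of system services into `ansible_facts.services`. A minimal usage sketch under the same task name as the log — the consumer task is illustrative and not part of the role:

```yaml
# Minimal service_facts sketch. The module takes no arguments; its
# result is returned as ansible_facts.services, a dict keyed by unit
# name (e.g. "NetworkManager.service") with state/status/source fields.
- name: Check which services are running
  service_facts:

# Illustrative consumer (not from the role): inspect one gathered entry.
- name: Example check against the gathered facts
  debug:
    msg: "NM state: {{ ansible_facts.services['NetworkManager.service'].state }}"
```

The large JSON payload later in this log is exactly that `ansible_facts.services` structure coming back from `AnsiballZ_service_facts.py` on managed_node3.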
8454 1726882433.10350: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check which services are running 8454 1726882433.10645: in run() - task 0affe814-3a2d-f59f-16b9-000000000496 8454 1726882433.10649: variable 'ansible_search_path' from source: unknown 8454 1726882433.10652: variable 'ansible_search_path' from source: unknown 8454 1726882433.10656: calling self._execute() 8454 1726882433.10761: variable 'ansible_host' from source: host vars for 'managed_node3' 8454 1726882433.10768: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8454 1726882433.10782: variable 'omit' from source: magic vars 8454 1726882433.11282: variable 'ansible_distribution_major_version' from source: facts 8454 1726882433.11293: Evaluated conditional (ansible_distribution_major_version != '6'): True 8454 1726882433.11299: variable 'omit' from source: magic vars 8454 1726882433.11418: variable 'omit' from source: magic vars 8454 1726882433.11469: variable 'omit' from source: magic vars 8454 1726882433.11514: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 8454 1726882433.11558: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 8454 1726882433.11590: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 8454 1726882433.11612: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8454 1726882433.11626: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8454 1726882433.11663: variable 'inventory_hostname' from source: host vars for 'managed_node3' 8454 1726882433.11671: variable 'ansible_host' from source: host vars for 'managed_node3' 8454 1726882433.11682: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed_node3' 8454 1726882433.11811: Set connection var ansible_connection to ssh 8454 1726882433.11823: Set connection var ansible_shell_executable to /bin/sh 8454 1726882433.11831: Set connection var ansible_timeout to 10 8454 1726882433.11835: Set connection var ansible_shell_type to sh 8454 1726882433.11848: Set connection var ansible_pipelining to False 8454 1726882433.11856: Set connection var ansible_module_compression to ZIP_DEFLATED 8454 1726882433.11883: variable 'ansible_shell_executable' from source: unknown 8454 1726882433.11891: variable 'ansible_connection' from source: unknown 8454 1726882433.11901: variable 'ansible_module_compression' from source: unknown 8454 1726882433.11904: variable 'ansible_shell_type' from source: unknown 8454 1726882433.11908: variable 'ansible_shell_executable' from source: unknown 8454 1726882433.12040: variable 'ansible_host' from source: host vars for 'managed_node3' 8454 1726882433.12044: variable 'ansible_pipelining' from source: unknown 8454 1726882433.12047: variable 'ansible_timeout' from source: unknown 8454 1726882433.12050: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8454 1726882433.12342: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 8454 1726882433.12347: variable 'omit' from source: magic vars 8454 1726882433.12350: starting attempt loop 8454 1726882433.12352: running the handler 8454 1726882433.12355: _low_level_execute_command(): starting 8454 1726882433.12357: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 8454 1726882433.13169: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration 
data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8454 1726882433.13194: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 8454 1726882433.13210: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 8454 1726882433.13280: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8454 1726882433.13526: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8454 1726882433.15306: stdout chunk (state=3): >>>/root <<< 8454 1726882433.15473: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8454 1726882433.15481: stdout chunk (state=3): >>><<< 8454 1726882433.15624: stderr chunk (state=3): >>><<< 8454 1726882433.15629: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is 
address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 8454 1726882433.15642: _low_level_execute_command(): starting 8454 1726882433.15650: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882433.1562595-9598-148825663210077 `" && echo ansible-tmp-1726882433.1562595-9598-148825663210077="` echo /root/.ansible/tmp/ansible-tmp-1726882433.1562595-9598-148825663210077 `" ) && sleep 0' 8454 1726882433.16307: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 8454 1726882433.16320: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 8454 1726882433.16341: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 8454 1726882433.16390: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8454 1726882433.16502: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 8454 1726882433.16506: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 8454 1726882433.16541: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8454 1726882433.16654: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8454 1726882433.18746: stdout chunk (state=3): >>>ansible-tmp-1726882433.1562595-9598-148825663210077=/root/.ansible/tmp/ansible-tmp-1726882433.1562595-9598-148825663210077 <<< 8454 1726882433.18929: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8454 1726882433.18932: stdout chunk (state=3): >>><<< 8454 1726882433.18937: stderr chunk (state=3): >>><<< 8454 1726882433.19139: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882433.1562595-9598-148825663210077=/root/.ansible/tmp/ansible-tmp-1726882433.1562595-9598-148825663210077 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 
originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 8454 1726882433.19143: variable 'ansible_module_compression' from source: unknown 8454 1726882433.19146: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-8454ynh6eu7y/ansiballz_cache/ansible.modules.service_facts-ZIP_DEFLATED 8454 1726882433.19148: variable 'ansible_facts' from source: unknown 8454 1726882433.19199: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882433.1562595-9598-148825663210077/AnsiballZ_service_facts.py 8454 1726882433.19393: Sending initial data 8454 1726882433.19404: Sent initial data (160 bytes) 8454 1726882433.20076: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 8454 1726882433.20097: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 8454 1726882433.20148: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8454 
1726882433.20230: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 8454 1726882433.20270: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 8454 1726882433.20297: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8454 1726882433.20451: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8454 1726882433.22128: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 8454 1726882433.22273: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 8454 1726882433.22407: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-8454ynh6eu7y/tmpmo7rt9nm /root/.ansible/tmp/ansible-tmp-1726882433.1562595-9598-148825663210077/AnsiballZ_service_facts.py <<< 8454 1726882433.22410: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882433.1562595-9598-148825663210077/AnsiballZ_service_facts.py" <<< 8454 1726882433.22562: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-8454ynh6eu7y/tmpmo7rt9nm" to remote "/root/.ansible/tmp/ansible-tmp-1726882433.1562595-9598-148825663210077/AnsiballZ_service_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882433.1562595-9598-148825663210077/AnsiballZ_service_facts.py" <<< 8454 1726882433.24249: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8454 1726882433.24252: stderr chunk (state=3): >>><<< 8454 1726882433.24255: stdout chunk (state=3): >>><<< 8454 1726882433.24262: done transferring module to remote 8454 1726882433.24278: _low_level_execute_command(): starting 8454 1726882433.24287: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882433.1562595-9598-148825663210077/ /root/.ansible/tmp/ansible-tmp-1726882433.1562595-9598-148825663210077/AnsiballZ_service_facts.py && sleep 0' 8454 1726882433.25360: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 8454 1726882433.25363: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 8454 1726882433.25365: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 8454 1726882433.25368: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 8454 1726882433.25371: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.41.238 originally 
10.31.41.238 <<< 8454 1726882433.25374: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8454 1726882433.25391: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 8454 1726882433.25395: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 8454 1726882433.25422: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8454 1726882433.25556: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8454 1726882433.27948: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8454 1726882433.27952: stdout chunk (state=3): >>><<< 8454 1726882433.27959: stderr chunk (state=3): >>><<< 8454 1726882433.28041: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration 
data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 8454 1726882433.28045: _low_level_execute_command(): starting 8454 1726882433.28048: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882433.1562595-9598-148825663210077/AnsiballZ_service_facts.py && sleep 0' 8454 1726882433.28655: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 8454 1726882433.28669: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 8454 1726882433.28759: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8454 1726882433.28830: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: 
mux_client_hello_exchange: master version 4 <<< 8454 1726882433.29000: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8454 1726882435.20103: stdout chunk (state=3): >>> {"ansible_facts": {"services": {"auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": 
"stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": 
"stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "mdmonitor.service": {"name": "mdmonitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": 
"stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-quit.service": {"name": "plymouth-quit.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-read-write.service": {"name": "plymouth-read-write.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-switch-root.service": {"name": "plymouth-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "raid-check.service": {"name": "raid-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", 
"status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "snapd.seeded.service": {"name": "snapd.seeded.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, 
"systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-plymouth.service": {"name": "systemd-ask-password-plymouth.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-homed-activate.service": {"name": "systemd-homed-activate.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-homed.service": {"name": "systemd-homed.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", 
"state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-networkd.service": {"name": "systemd-networkd.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": 
"systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-resolved.service": {"name": "systemd-resolved.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": 
"systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-userdbd.service": {"name": "systemd-userdbd.service", "state": "running", "status": "indirect", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-zram-setup@zram0.<<< 8454 1726882435.20130: stdout chunk (state=3): >>>service": {"name": "systemd-zram-setup@zram0.service", "state": "stopped", "status": "active", "source": "systemd"}, "udisks2.service": {"name": "udisks2.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "unbound-anchor.service": {"name": "unbound-anchor.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "arp-ethers.service": {"name": "arp-ethers.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "bluetooth.service": {"name": "bluetooth.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": 
"disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.bluez.service": {"name": "dbus-org.bluez.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.home1.service": {"name": "dbus-org.freedesktop.home1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.oom1.service": {"name": "dbus-org.freedesktop.oom1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.portable1.service": {"name": "dbus-org.freedesktop.portable1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.resolve1.service": {"name": "dbus-org.freedesktop.resolve1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": 
"systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fwupd-offline-update.service": {"name": "fwupd-offline-update.service",<<< 8454 1726882435.20143: stdout chunk (state=3): >>> "state": "inactive", "status": "static", "source": "systemd"}, "fwupd-refresh.service": {"name": "fwupd-refresh.service", "state": "inactive", "status": "static", "source": "systemd"}, "fwupd.service": {"name": "fwupd.service", "state": "inactive", "status": "static", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "mdadm-grow-continue@.service": {"name": "mdadm-grow-continue@.service", "state": "unknown", "status": "static", "source": "systemd"}, 
"mdadm-last-resort@.service": {"name": "mdadm-last-resort@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdcheck_continue.service": {"name": "mdcheck_continue.service", "state": "inactive", "status": "static", "source": "systemd"}, "mdcheck_start.service": {"name": "mdcheck_start.service", "state": "inactive", "status": "static", "source": "systemd"}, "mdmon@.service": {"name": "mdmon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdmonitor-oneshot.service": {"name": "mdmonitor-oneshot.service", "state": "inactive", "status": "static", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-halt.service": {"name": "plymouth-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-kexec.service": {"name": "plymouth-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-poweroff.service": {"name": "plymouth-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-reboot.service": {"name": "plymouth-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-switch-root-initramfs.service": {"name": "plymouth-switch-root-initramfs.service", "state": "inactive", "status": "static", "source": "systemd"}, 
"polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon.service": {"name": "quotaon.service", "state": "inactive", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": 
"systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", 
"status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-networkd-wait-online@.service": {"name": "systemd-networkd-wait-online@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-portabled.service": {"name": "systemd-portabled.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck.service": {"name": "systemd-quotacheck.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-time-wait-sync.service": {"name": "systemd-time-wait-sync.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", 
"state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-zram-setup@.service": {"name": "systemd-zram-setup@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} <<< 8454 1726882435.21803: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.41.238 closed. <<< 8454 1726882435.21862: stderr chunk (state=3): >>><<< 8454 1726882435.21865: stdout chunk (state=3): >>><<< 8454 1726882435.21893: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"services": {"auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", 
"status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, 
"initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "mdmonitor.service": {"name": "mdmonitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, 
"NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-quit.service": {"name": "plymouth-quit.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-read-write.service": {"name": "plymouth-read-write.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-switch-root.service": {"name": "plymouth-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "raid-check.service": {"name": 
"raid-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "snapd.seeded.service": {"name": "snapd.seeded.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": 
"systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-plymouth.service": {"name": "systemd-ask-password-plymouth.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-homed-activate.service": {"name": "systemd-homed-activate.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-homed.service": {"name": 
"systemd-homed.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-networkd.service": {"name": "systemd-networkd.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-resolved.service": {"name": "systemd-resolved.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-userdbd.service": {"name": "systemd-userdbd.service", "state": "running", "status": "indirect", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-zram-setup@zram0.service": {"name": "systemd-zram-setup@zram0.service", "state": "stopped", "status": "active", "source": "systemd"}, "udisks2.service": {"name": "udisks2.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "unbound-anchor.service": {"name": "unbound-anchor.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "arp-ethers.service": {"name": 
"arp-ethers.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "bluetooth.service": {"name": "bluetooth.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.bluez.service": {"name": "dbus-org.bluez.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.home1.service": {"name": "dbus-org.freedesktop.home1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.oom1.service": 
{"name": "dbus-org.freedesktop.oom1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.portable1.service": {"name": "dbus-org.freedesktop.portable1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.resolve1.service": {"name": "dbus-org.freedesktop.resolve1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fwupd-offline-update.service": {"name": "fwupd-offline-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "fwupd-refresh.service": {"name": "fwupd-refresh.service", "state": "inactive", "status": "static", "source": "systemd"}, "fwupd.service": {"name": "fwupd.service", "state": "inactive", "status": "static", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", 
"source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "mdadm-grow-continue@.service": {"name": "mdadm-grow-continue@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdadm-last-resort@.service": {"name": "mdadm-last-resort@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdcheck_continue.service": {"name": "mdcheck_continue.service", "state": "inactive", "status": "static", "source": "systemd"}, "mdcheck_start.service": {"name": "mdcheck_start.service", "state": "inactive", "status": "static", "source": "systemd"}, "mdmon@.service": {"name": "mdmon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdmonitor-oneshot.service": {"name": "mdmonitor-oneshot.service", "state": "inactive", "status": "static", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-halt.service": {"name": "plymouth-halt.service", "state": 
"inactive", "status": "static", "source": "systemd"}, "plymouth-kexec.service": {"name": "plymouth-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-poweroff.service": {"name": "plymouth-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-reboot.service": {"name": "plymouth-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-switch-root-initramfs.service": {"name": "plymouth-switch-root-initramfs.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon.service": {"name": "quotaon.service", "state": "inactive", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", 
"status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": 
"systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-networkd-wait-online@.service": {"name": "systemd-networkd-wait-online@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-portabled.service": {"name": "systemd-portabled.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck.service": {"name": "systemd-quotacheck.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": 
"systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-time-wait-sync.service": {"name": "systemd-time-wait-sync.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-zram-setup@.service": {"name": "systemd-zram-setup@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 
10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.41.238 closed. 8454 1726882435.22621: done with _execute_module (service_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'service_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882433.1562595-9598-148825663210077/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 8454 1726882435.22639: _low_level_execute_command(): starting 8454 1726882435.22646: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882433.1562595-9598-148825663210077/ > /dev/null 2>&1 && sleep 0' 8454 1726882435.23128: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 8454 1726882435.23132: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 8454 1726882435.23136: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8454 1726882435.23139: stderr chunk (state=3): >>>debug1: 
configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 8454 1726882435.23142: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8454 1726882435.23186: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 8454 1726882435.23190: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8454 1726882435.23311: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8454 1726882435.25335: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8454 1726882435.25383: stderr chunk (state=3): >>><<< 8454 1726882435.25387: stdout chunk (state=3): >>><<< 8454 1726882435.25398: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 
debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 8454 1726882435.25405: handler run complete 8454 1726882435.25565: variable 'ansible_facts' from source: unknown 8454 1726882435.25709: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8454 1726882435.26248: variable 'ansible_facts' from source: unknown 8454 1726882435.26369: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8454 1726882435.26565: attempt loop complete, returning result 8454 1726882435.26571: _execute() done 8454 1726882435.26576: dumping result to json 8454 1726882435.26626: done dumping result, returning 8454 1726882435.26636: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check which services are running [0affe814-3a2d-f59f-16b9-000000000496] 8454 1726882435.26641: sending task result for task 0affe814-3a2d-f59f-16b9-000000000496 ok: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 8454 1726882435.27499: done sending task result for task 0affe814-3a2d-f59f-16b9-000000000496 8454 1726882435.27502: WORKER PROCESS EXITING 8454 1726882435.27511: no more pending results, returning what we have 8454 1726882435.27514: results queue empty 8454 1726882435.27514: checking for any_errors_fatal 8454 1726882435.27518: done checking for any_errors_fatal 8454 1726882435.27519: checking for max_fail_percentage 8454 1726882435.27520: done checking for max_fail_percentage 8454 1726882435.27521: checking to see if all hosts have failed and the running result is not ok 8454 1726882435.27522: 
done checking to see if all hosts have failed 8454 1726882435.27522: getting the remaining hosts for this loop 8454 1726882435.27523: done getting the remaining hosts for this loop 8454 1726882435.27526: getting the next task for host managed_node3 8454 1726882435.27531: done getting next task for host managed_node3 8454 1726882435.27536: ^ task is: TASK: fedora.linux_system_roles.network : Check which packages are installed 8454 1726882435.27540: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 8454 1726882435.27548: getting variables 8454 1726882435.27549: in VariableManager get_vars() 8454 1726882435.27580: Calling all_inventory to load vars for managed_node3 8454 1726882435.27582: Calling groups_inventory to load vars for managed_node3 8454 1726882435.27584: Calling all_plugins_inventory to load vars for managed_node3 8454 1726882435.27592: Calling all_plugins_play to load vars for managed_node3 8454 1726882435.27594: Calling groups_plugins_inventory to load vars for managed_node3 8454 1726882435.27596: Calling groups_plugins_play to load vars for managed_node3 8454 1726882435.28737: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8454 1726882435.30307: done with get_vars() 8454 1726882435.30328: done getting variables TASK [fedora.linux_system_roles.network : Check which packages are installed] *** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 Friday 20 September 2024 21:33:55 -0400 (0:00:02.210) 0:00:33.321 ****** 8454 1726882435.30413: entering _queue_task() for managed_node3/package_facts 8454 1726882435.30644: worker is 1 (out of 1 available) 8454 1726882435.30658: exiting _queue_task() for managed_node3/package_facts 8454 1726882435.30672: done queuing things up, now waiting for results queue to drain 8454 1726882435.30673: waiting for pending results... 
8454 1726882435.30863: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check which packages are installed 8454 1726882435.30998: in run() - task 0affe814-3a2d-f59f-16b9-000000000497 8454 1726882435.31012: variable 'ansible_search_path' from source: unknown 8454 1726882435.31020: variable 'ansible_search_path' from source: unknown 8454 1726882435.31057: calling self._execute() 8454 1726882435.31135: variable 'ansible_host' from source: host vars for 'managed_node3' 8454 1726882435.31143: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8454 1726882435.31155: variable 'omit' from source: magic vars 8454 1726882435.31479: variable 'ansible_distribution_major_version' from source: facts 8454 1726882435.31487: Evaluated conditional (ansible_distribution_major_version != '6'): True 8454 1726882435.31492: variable 'omit' from source: magic vars 8454 1726882435.31639: variable 'omit' from source: magic vars 8454 1726882435.31644: variable 'omit' from source: magic vars 8454 1726882435.31687: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 8454 1726882435.31732: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 8454 1726882435.31763: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 8454 1726882435.31789: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8454 1726882435.31807: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8454 1726882435.31849: variable 'inventory_hostname' from source: host vars for 'managed_node3' 8454 1726882435.31858: variable 'ansible_host' from source: host vars for 'managed_node3' 8454 1726882435.31867: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed_node3' 8454 1726882435.32039: Set connection var ansible_connection to ssh 8454 1726882435.32043: Set connection var ansible_shell_executable to /bin/sh 8454 1726882435.32045: Set connection var ansible_timeout to 10 8454 1726882435.32047: Set connection var ansible_shell_type to sh 8454 1726882435.32049: Set connection var ansible_pipelining to False 8454 1726882435.32052: Set connection var ansible_module_compression to ZIP_DEFLATED 8454 1726882435.32085: variable 'ansible_shell_executable' from source: unknown 8454 1726882435.32095: variable 'ansible_connection' from source: unknown 8454 1726882435.32104: variable 'ansible_module_compression' from source: unknown 8454 1726882435.32111: variable 'ansible_shell_type' from source: unknown 8454 1726882435.32118: variable 'ansible_shell_executable' from source: unknown 8454 1726882435.32125: variable 'ansible_host' from source: host vars for 'managed_node3' 8454 1726882435.32133: variable 'ansible_pipelining' from source: unknown 8454 1726882435.32241: variable 'ansible_timeout' from source: unknown 8454 1726882435.32245: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8454 1726882435.32386: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 8454 1726882435.32406: variable 'omit' from source: magic vars 8454 1726882435.32416: starting attempt loop 8454 1726882435.32423: running the handler 8454 1726882435.32445: _low_level_execute_command(): starting 8454 1726882435.32458: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 8454 1726882435.33057: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 8454 1726882435.33085: 
stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8454 1726882435.33134: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 8454 1726882435.33159: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 8454 1726882435.33274: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8454 1726882435.35121: stdout chunk (state=3): >>>/root <<< 8454 1726882435.35311: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8454 1726882435.35315: stdout chunk (state=3): >>><<< 8454 1726882435.35318: stderr chunk (state=3): >>><<< 8454 1726882435.35436: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is 
address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 8454 1726882435.35441: _low_level_execute_command(): starting 8454 1726882435.35444: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882435.353414-9682-161685566641248 `" && echo ansible-tmp-1726882435.353414-9682-161685566641248="` echo /root/.ansible/tmp/ansible-tmp-1726882435.353414-9682-161685566641248 `" ) && sleep 0' 8454 1726882435.36059: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 8454 1726882435.36063: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found 
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8454 1726882435.36112: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 8454 1726882435.36130: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 8454 1726882435.36151: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8454 1726882435.36304: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8454 1726882435.38413: stdout chunk (state=3): >>>ansible-tmp-1726882435.353414-9682-161685566641248=/root/.ansible/tmp/ansible-tmp-1726882435.353414-9682-161685566641248 <<< 8454 1726882435.38550: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8454 1726882435.38574: stderr chunk (state=3): >>><<< 8454 1726882435.38577: stdout chunk (state=3): >>><<< 8454 1726882435.38598: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882435.353414-9682-161685566641248=/root/.ansible/tmp/ansible-tmp-1726882435.353414-9682-161685566641248 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: 
Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 8454 1726882435.38664: variable 'ansible_module_compression' from source: unknown 8454 1726882435.38689: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-8454ynh6eu7y/ansiballz_cache/ansible.modules.package_facts-ZIP_DEFLATED 8454 1726882435.38939: variable 'ansible_facts' from source: unknown 8454 1726882435.38972: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882435.353414-9682-161685566641248/AnsiballZ_package_facts.py 8454 1726882435.39224: Sending initial data 8454 1726882435.39237: Sent initial data (159 bytes) 8454 1726882435.39731: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 8454 1726882435.39747: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 8454 1726882435.39794: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8454 1726882435.39842: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 8454 1726882435.39857: stderr 
chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8454 1726882435.39975: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8454 1726882435.41700: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" <<< 8454 1726882435.41710: stderr chunk (state=3): >>>debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 8454 1726882435.41812: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 8454 1726882435.41938: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-8454ynh6eu7y/tmp2g6sevi5 /root/.ansible/tmp/ansible-tmp-1726882435.353414-9682-161685566641248/AnsiballZ_package_facts.py <<< 8454 1726882435.41942: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882435.353414-9682-161685566641248/AnsiballZ_package_facts.py" <<< 8454 1726882435.42054: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-8454ynh6eu7y/tmp2g6sevi5" to remote "/root/.ansible/tmp/ansible-tmp-1726882435.353414-9682-161685566641248/AnsiballZ_package_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882435.353414-9682-161685566641248/AnsiballZ_package_facts.py" <<< 8454 1726882435.44473: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8454 1726882435.44528: stderr chunk (state=3): >>><<< 8454 1726882435.44532: stdout chunk (state=3): >>><<< 8454 1726882435.44553: done transferring module to remote 8454 1726882435.44566: _low_level_execute_command(): starting 8454 1726882435.44572: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882435.353414-9682-161685566641248/ /root/.ansible/tmp/ansible-tmp-1726882435.353414-9682-161685566641248/AnsiballZ_package_facts.py && sleep 0' 8454 1726882435.45156: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8454 1726882435.45172: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 8454 
1726882435.45242: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.41.238 is address <<< 8454 1726882435.45256: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8454 1726882435.45292: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 8454 1726882435.45308: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 8454 1726882435.45328: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8454 1726882435.45480: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8454 1726882435.47477: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8454 1726882435.47524: stderr chunk (state=3): >>><<< 8454 1726882435.47527: stdout chunk (state=3): >>><<< 8454 1726882435.47542: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match 
for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 8454 1726882435.47546: _low_level_execute_command(): starting 8454 1726882435.47551: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882435.353414-9682-161685566641248/AnsiballZ_package_facts.py && sleep 0' 8454 1726882435.47978: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 8454 1726882435.47982: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8454 1726882435.47985: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration <<< 8454 1726882435.47987: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found <<< 8454 1726882435.47992: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8454 1726882435.48032: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 8454 1726882435.48052: stderr chunk 
(state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8454 1726882435.48169: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8454 1726882436.13075: stdout chunk (state=3): >>> {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "13.3.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "12.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.40", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release-identity-basic": [{"name": "fedora-release-identity-basic", "version": "39", "release": "36", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-gpg-keys": [{"name": "fedora-gpg-keys", "version": "39", "release": "2", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-repos": [{"name": "fedora-repos", "version": "39", "release": "2", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release-common": [{"name": "fedora-release-common", "version": "39", "release": "36", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release": [{"name": "fedora-release", "version": "39", "release": "36", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": 
"2.14.4", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "18.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "abattis-cantarell-vf-fonts": [{"name": "abattis-cantarell-vf-fonts", "version": "0.301", "release": "10.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.0", "release": "9.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.0", "release": "9.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240909", "release": 
"1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "pkgconf-m4": [{"name": "pkgconf-m4", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.42", "release": "1.fc39.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "libreport-filesystem": [{"name": "libreport-filesystem", "version": "2.17.11", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.3", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.3", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dhcp-common": [{"name": "dhcp-common", "version": "4.4.3", 
"release": "9.P1.fc39", "epoch": 12, "arch": "noarch", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "7.20230520.fc39.1", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "7.20230520.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib": [{"name": "zlib", "version": "1.2.13", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.48", "release": 
"9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "16.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "r<<< 8454 1726882436.13097: stdout chunk (state=3): >>>pm"}], "readline": [{"name": "readline", "version": "8.2", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.10.4", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nspr": [{"name": "nspr", "version": "4.35.0", "release": "22.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "5.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.47", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.42", "release": "1.fc39.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-util": [{"name": "nss-util", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.42.0", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "13.3.1", "release": "1.fc39", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.26", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.10.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "53.20240808cvs.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.1", "release": "8.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", 
"version": "2.3.1", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.3", "release": "8.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "3.1.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.5", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.8", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.9.0", "release": "6.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.5", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.15.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.9.1", "release": "2.fc39", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.44", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.5.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.5.3", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.14.0", "release": "2.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.55.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "4.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "protobuf-c": [{"name": "protobuf-c", "version": "1.4.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.44", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.3", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", 
"version": "1.2.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "7.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "7.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "7.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.4", "relea<<< 8454 1726882436.13125: stdout chunk (state=3): >>>se": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.6.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.1.0", "release": "4.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "13.3.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.14", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", 
"version": "2.9.9", "release": "17.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.10.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdb": [{"name": "libdb", "version": "5.3.28", "release": "56.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.9", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ntfs-3g-libs": [{"name": "ntfs-3g-libs", "version": "2022.10.3", "release": "3.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ntfs-3g": [{"name": "ntfs-3g", "version": "2022.10.3", "release": "3.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libibverbs": [{"name": "libibverbs", "version": "46.0", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpcap": [{"name": "libpcap", "version": "1.10.4", "release": "2.fc39", "epoch": 14, "arch": "x86_64", "source": "rpm"}], "libb2": [{"name": "libb2", "version": "0.98.1", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbytesize": [{"name": "libbytesize", "version": "2.11", "release": "99.fc39", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nano": [{"name": "nano", "version": "7.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nano-default-editor": [{"name": "nano-default-editor", "version": "7.2", "release": "5.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libatasmart": [{"name": "libatasmart", "version": "0.19", "release": "26.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.5", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.20", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn-freebl": [{"name": "nss-softokn-freebl", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn": [{"name": "nss-softokn", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.0", "release": "9.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "1.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "36", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "633", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.28.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "exfatprogs": [{"name": "exfatprogs", "version": "1.2.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmaxminddb": [{"name": "libmaxminddb", "version": "1.11.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipcalc": [{"name": "ipcalc", "version": "1.0.3", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.33", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.6", "release": "17.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": 
"2.0.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dosfstools": [{"name": "dosfstools", "version": "4.2", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.13.1", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libargon2": [{"name": "libargon2", "version": "20190702", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "1.fc39", "epoch": null, <<< 8454 1726882436.13161: stdout chunk (state=3): >>>"arch": "x86_64", "source": "rpm"}], "libevdev": [{"name": "libevdev", "version": "1.13.1", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth-core-libs": [{"name": "plymouth-core-libs", "version": "24.004.60", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.8", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "24.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.9", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.7", "release": "3.fc39", "epoch": 1, "arch": "x86_64", 
"source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpkgconf": [{"name": "libpkgconf", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf": [{"name": "pkgconf", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-pkg-config": [{"name": "pkgconf-pkg-config", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-efi": [{"name": "fwupd-efi", "version": "1.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsigsegv": [{"name": "libsigsegv", "version": "2.14", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.2.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtool-ltdl": [{"name": "libtool-ltdl", "version": "2.4.7", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "btrfs-progs": [{"name": "btrfs-progs", "version": "6.10.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": 
"psmisc", "version": "23.6", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.4.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.4.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.0.5", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.3", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-pip-wheel": [{"name": "python-pip-wheel", "version": "23.2.1", "release"<<< 8454 1726882436.13166: stdout chunk (state=3): >>>: "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": 
"4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnsl2": [{"name": "libnsl2", "version": "2.0.0", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.0.2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.2.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20231204", "release": "1.git1e3a2e4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20231204", "release": "1.git1e3a2e4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.12", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": 
"grub2-tools-minimal", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.4.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.5.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils",<<< 8454 1726882436.13180: stdout chunk (state=3): >>> "version": "9.3", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2023.2.60_v7.0.306", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.1.4", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "30", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.0.2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", 
"version": "1.5.3", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.6.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "30", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.13.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-networkd": [{"name": "systemd-networkd", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-resolved": [{"name": "systemd-resolved", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.4.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "10.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "059", "release": "16.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "73.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": 
"2.06", "release": "121.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.78.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-utils": [{"name": "libblockdev-utils", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "123", "release": "1.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "123", "release": "1.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "26.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-glib": [{"name": "json-glib", "version": "1.8.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgudev": [{"name": "libgudev", "version": "238", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.10.9", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.10.9", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgusb": [{"name": "libgusb", "<<< 8454 1726882436.13203: stdout chunk (state=3): >>>version": "0.4.9", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmbim": [{"name": "libmbim", "version": "1.28.4", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shared-mime-info": [{"name": "shared-mime-info", "version": "2.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"libxmlb": [{"name": "libxmlb", "version": "0.3.19", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.3p1", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.1", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zchunk-libs": [{"name": "zchunk-libs", "version": "1.5.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.8.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.10.9", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.5.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.0.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpgme": [{"name": "gpgme", "version": "1.20.0", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libjcat": [{"name": "libjcat", "version": "0.2.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev": [{"name": "libblockdev", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-fs": [{"name": "libblockdev-fs", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-loop": [{"name": "libblockdev-loop", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-part": [{"name": 
"libblockdev-part", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-swap": [{"name": "libblockdev-swap", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.44.2", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bluez": [{"name": "bluez", "version": "5.77", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ModemManager-glib": [{"name": "ModemManager-glib", "version": "1.20.6", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.78.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libqrtr-glib": [{"name": "libqrtr-glib", "version": "1.2.2", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libqmi": [{"name": "libqmi", "version": "1.32.4", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libudisks2": [{"name": "libudisks2", "version": "2.10.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuser": [{"name": "libuser", "version": "0.64", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ntfsprogs": [{"name": "ntfsprogs", "version": "2022.10.3", "release": "3.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.20", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20221126", 
"release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mdadm": [{"name": "mdadm", "version": "4.2", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-mdraid": [{"name": "libblockdev-mdraid", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zram-generator": [{"name": "zram-generator", "version": "1.1.2", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth": [{"name": "plymouth", "version": "24.004.60", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth-scripts": [{"name": "plymouth-scripts", "version": "24.004.60", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "1.p5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "1.p5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.7.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.30", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "deltarpm": [{"name": "deltarpm", "version": "3.6.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "librepo": [{"name": "librepo", "version": 
"1.18.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unbound-anchor": [{"name": "unbound-anchor", "version": "1.20.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unbound-libs": [{"name": "unbound-libs", "version": "1.20.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-unbound": [{"name": "python3-unbound", "version": "1.20.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfsverity": [{"name": "libfsverity", "version": "1.4", "release": "10.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnvme": [{"name": "libnvme", "version": "1.6", "release": "2.fc39", "epoch": null<<< 8454 1726882436.13223: stdout chunk (state=3): >>>, "arch": "x86_64", "source": "rpm"}], "libblockdev-nvme": [{"name": "libblockdev-nvme", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-sysinit": [{"name": "nss-sysinit", "version": 
"3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss": [{"name": "nss", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "volume_key-libs": [{"name": "volume_key-libs", "version": "0.3.12", "release": "20.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-crypto": [{"name": "libblockdev-crypto", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "udisks2": [{"name": "udisks2", "version": "2.10.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-plugin-modem-manager": [{"name": "fwupd-plugin-modem-manager", "version": "1.9.21", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-plugin-uefi-capsule-data": [{"name": "fwupd-plugin-uefi-capsule-data", "version": "1.9.21", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd": [{"name": "fwupd", "version": "1.9.21", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.9", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "40.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.2.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.5", "release": "8.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "39.7", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "39.7", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": 
"rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.20", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.8.0", "release": "6.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.7", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.0.5", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "12.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "10.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.9.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.9.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.0.5", <<< 8454 1726882436.13245: stdout chunk (state=3): >>>"release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": 
[{"name": "yum", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zram-generator-defaults": [{"name": "zram-generator-defaults", "version": "1.1.2", "release": "11.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dhcp-client": [{"name": "dhcp-client", "version": "4.4.3", "release": "9.P1.fc39", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "3.1.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "passwd": [{"name": "passwd", "version": "0.80", "release": "15.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.44.2", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.10.9", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.3p1", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.3p1", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "059", "release": "16.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.11.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "parted": 
[{"name": "parted", "version": "3.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.5", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.0", "release": "9.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "7.20230520.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "34.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.58", "release": "500.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.88", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.188", "release": "501.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], 
"perl-URI": [{"name": "perl-URI", "version": "5.21", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2023.0511", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "3.fc39", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.92", "release": "10.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20230801", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.10", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.083", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "502.fc39", "epoch": 0, "arch": "noarch", 
"source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.13", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "4.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.037", "release": "3.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.13", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.15", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.28", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", 
"release": "500.fc39", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.52", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "500.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "500.fc39", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.37", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.13", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "500.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.54", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.77", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"perl-PathTools": [{"name": "perl-PathTools", "version": "3.89", "release": "500.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.16", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.19", "release": "500.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.54", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.38.2", "release": "502.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.38.2", "release": "502.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "boost-system": [{"name": "boost-system", "version": "1.81.0", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.40", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.40", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd-devel": [{"name": "libzstd-devel", "version": "1.5.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-devel": [{"name": "zlib-devel", "version": "1.2.13", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf-devel": [{"name": "elfutils-libelf-devel", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unzip": [{"name": "unzip", "version": "6.0", "release": "62.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "m4": [{"name": "m4", "version": "1.4.19", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "libmpc": [{"name": "libmpc", "version": "1.3.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "avahi-libs": [{"name": "avahi-libs", "version": "0.8", "release": "24.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "aspell": [{"name": "aspell", "version": "0.60.8", "release": "12.fc39", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "cpp": [{"name": "cpp", "version": "13.3.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bison": [{"name": "bison", "version": "3.8.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "flex": [{"name": "flex", "version": "2.6.4", "release": "13.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zip": [{"name": "zip", "version": "3.0", "release": "39.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-thread": [{"name": "boost-thread", "version": "1.81.0", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "strace": [{"name": "strace", "version": "6.10", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-devel": [{"name": "openssl-devel", "version": "3.1.4", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-headers": [{"name": "kernel-headers", "version": "6.10.3", "release": "200.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-headers-x86": [{"name": "glibc-headers-x86", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-devel": [{"name": "glibc-devel", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-devel": [{"name": "libxcrypt-devel", "version": "4.4.36", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client-devel": [{"name": "elfutils-debuginfod-client-devel", "version": "0.191", "release": "2.fc39", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "efivar-libs": [{"name": "efivar-libs", "version": "39", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mokutil": [{"name": "mokutil", "version": "0.6.0", "release": "7.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "boost-atomic": [{"name": "boost-atomic", "version": "1.81.0", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-filesystem": [{"name": "boost-filesystem", "version": "1.81.0", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-devel": [{"name": "xz-devel", "version": "5.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-devel": [{"name": "elfutils-devel", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tbb": [{"name": "tbb", "version": "2020.3", "release": "20.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dyninst": [{"name": "dyninst", "version": "12.3.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-runtime": [{"name": "systemtap-runtime", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-client": [{"name": "systemtap-client", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gc": [{"name": "gc", "version": "8.2.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "guile22": [{"name": "guile22", "version": "2.2.7", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "2.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gcc": [{"name": "gcc", "version": "13.3.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-devel": [{"name": "kernel-devel", "version": "6.10.9", "release": "100.fc39", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "systemtap-devel": [{"name": "systemtap-devel", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap": [{"name": "systemtap", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "qa-tools": [{"name": "qa-tools", "version": "4.1", "release": "5.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "aspell-en": [{"name": "aspell-en", "version": "2020.12.07", "release": "8.fc39", "epoch": 50, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "0.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.46.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.46.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "23.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "1.rc1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.43", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.42", 
"release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "4.9.3", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsodium": [{"name": "libsodium", "version": "1.0.18", "release": "15.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-utils": [{"name": "dnf-utils", "version": "4.9.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "xxhash-libs": [{"name": "xxhash-libs", "version": "0.8.2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "2.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "3.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "13.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.46.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.46.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libmetalink": [{"name": "libmetalink", "version": "0.1.3", "release": "32.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.21.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "8.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": 
[{"name": "libverto-libev", "version": "0.3.2", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.1", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "0.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gpm-libs": [{"name": "gpm-libs", "version": "1.20.7", "release": "44.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "19.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.31.2", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.fc39eng", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.fc39eng", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.fc39eng", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.1.0", "release": "4.fc39", "epoch": null, "arch": 
"noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.30.2", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.7.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "67.7.2", "release": "8.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.4.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "6.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "3.1.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.1.4", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gdisk": [{"name": "gdisk", "version": "1.0.10", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pysocks": [{"name": "python3-pysocks", "version": "1.7.1", "release": "20.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3+socks": [{"name": "python3-urllib3+socks", "version": "1.26.19", 
"release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.28.2", "release": "5.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "6.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-netifaces": [{"name": "python3-netifaces", "version": "0.11.0", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.5", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "5.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "net-tools": [{"name": "net-tools", "version": "2.0", "release": "0.67.20160912git.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.5", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.5", "release": "8.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "23.2.1", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "16.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.23", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.12", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.23", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "gpg-pubkey": [{"name": "gpg-pubkey", "version": "18b8e74c", "release": "62f2920f", "epoch": null, "arch": null, "source": "rpm"}], "rtl-sdr": [{"name": "rtl-sdr", "version": "0.6.0^20230921git1261fbb2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-pkcs11": [{"name": "openssl-pkcs11", "version": "0.4.12", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.4.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.16", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} <<< 8454 1726882436.15353: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared 
connection to 10.31.41.238 closed. <<< 8454 1726882436.15383: stderr chunk (state=3): >>><<< 8454 1726882436.15387: stdout chunk (state=3): >>><<< 8454 1726882436.15445: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "13.3.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "12.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.40", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release-identity-basic": [{"name": "fedora-release-identity-basic", "version": "39", "release": "36", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-gpg-keys": [{"name": "fedora-gpg-keys", "version": "39", "release": "2", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-repos": [{"name": "fedora-repos", "version": "39", "release": "2", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release-common": [{"name": "fedora-release-common", "version": "39", "release": "36", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release": [{"name": "fedora-release", "version": "39", "release": "36", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.4", 
"release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "18.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "abattis-cantarell-vf-fonts": [{"name": "abattis-cantarell-vf-fonts", "version": "0.301", "release": "10.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.0", "release": "9.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.0", "release": "9.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240909", "release": "1.fc39", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "pkgconf-m4": [{"name": "pkgconf-m4", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.42", "release": "1.fc39.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "libreport-filesystem": [{"name": "libreport-filesystem", "version": "2.17.11", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.3", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.3", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dhcp-common": [{"name": "dhcp-common", "version": "4.4.3", "release": 
"9.P1.fc39", "epoch": 12, "arch": "noarch", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "7.20230520.fc39.1", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "7.20230520.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib": [{"name": "zlib", "version": "1.2.13", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.48", "release": "9.fc39", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "16.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.10.4", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nspr": [{"name": "nspr", "version": "4.35.0", "release": "22.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "5.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.47", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.42", "release": "1.fc39.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-util": [{"name": "nss-util", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.42.0", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "13.3.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "alternatives": [{"name": 
"alternatives", "version": "1.26", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.10.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "53.20240808cvs.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.1", "release": "8.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.1", "release": "9.fc39", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.3", "release": "8.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "3.1.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.5", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.8", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.9.0", "release": "6.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.5", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.15.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.9.1", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": 
"e2fsprogs-libs", "version": "1.47.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.44", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.5.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.5.3", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.14.0", "release": "2.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.55.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "4.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "protobuf-c": [{"name": "protobuf-c", "version": "1.4.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.44", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.3", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.6", "release": "2.fc39", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "7.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "7.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "7.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.4", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.6.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.1.0", "release": "4.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "13.3.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.14", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "17.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": 
[{"name": "libbasicobjects", "version": "0.1.1", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.10.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdb": [{"name": "libdb", "version": "5.3.28", "release": "56.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.9", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ntfs-3g-libs": [{"name": "ntfs-3g-libs", "version": "2022.10.3", "release": "3.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ntfs-3g": [{"name": "ntfs-3g", "version": "2022.10.3", "release": "3.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libibverbs": [{"name": "libibverbs", "version": "46.0", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpcap": [{"name": "libpcap", "version": "1.10.4", "release": "2.fc39", "epoch": 14, "arch": "x86_64", "source": "rpm"}], "libb2": [{"name": "libb2", "version": "0.98.1", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbytesize": [{"name": "libbytesize", "version": "2.11", "release": "99.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": 
"11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nano": [{"name": "nano", "version": "7.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nano-default-editor": [{"name": "nano-default-editor", "version": "7.2", "release": "5.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libatasmart": [{"name": "libatasmart", "version": "0.19", "release": "26.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.5", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.20", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn-freebl": [{"name": "nss-softokn-freebl", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn": [{"name": "nss-softokn", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.0", "release": "9.fc39", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "1.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "36", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "633", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.28.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "exfatprogs": [{"name": "exfatprogs", "version": "1.2.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmaxminddb": [{"name": "libmaxminddb", "version": "1.11.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipcalc": [{"name": "ipcalc", "version": "1.0.3", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.33", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.6", "release": "17.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.0.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dosfstools": [{"name": 
"dosfstools", "version": "4.2", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.13.1", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libargon2": [{"name": "libargon2", "version": "20190702", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevdev": [{"name": "libevdev", "version": "1.13.1", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth-core-libs": [{"name": "plymouth-core-libs", "version": "24.004.60", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.8", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "24.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.9", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.7", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpkgconf": [{"name": "libpkgconf", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf": [{"name": "pkgconf", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-pkg-config": [{"name": "pkgconf-pkg-config", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-efi": [{"name": "fwupd-efi", "version": "1.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsigsegv": [{"name": "libsigsegv", "version": "2.14", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.2.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtool-ltdl": [{"name": "libtool-ltdl", "version": "2.4.7", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "btrfs-progs": [{"name": "btrfs-progs", "version": "6.10.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.4.0", 
"release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.4.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.0.5", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.3", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-pip-wheel": [{"name": "python-pip-wheel", "version": "23.2.1", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnsl2": [{"name": "libnsl2", "version": "2.0.0", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.0.2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.2.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20231204", "release": "1.git1e3a2e4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20231204", "release": "1.git1e3a2e4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.12", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.5", "release": "2.fc39", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.4.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.5.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.3", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2023.2.60_v7.0.306", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.1.4", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "30", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.0.2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.5.3", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.6.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": 
"kmod", "version": "30", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.13.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-networkd": [{"name": "systemd-networkd", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-resolved": [{"name": "systemd-resolved", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.4.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "10.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "059", "release": "16.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "73.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.78.6", 
"release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-utils": [{"name": "libblockdev-utils", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "123", "release": "1.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "123", "release": "1.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "26.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-glib": [{"name": "json-glib", "version": "1.8.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgudev": [{"name": "libgudev", "version": "238", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.10.9", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.10.9", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgusb": [{"name": "libgusb", "version": "0.4.9", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmbim": [{"name": "libmbim", "version": "1.28.4", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shared-mime-info": [{"name": "shared-mime-info", "version": "2.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxmlb": [{"name": "libxmlb", "version": "0.3.19", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.3p1", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.1", 
"release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zchunk-libs": [{"name": "zchunk-libs", "version": "1.5.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.8.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.10.9", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.5.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.0.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpgme": [{"name": "gpgme", "version": "1.20.0", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libjcat": [{"name": "libjcat", "version": "0.2.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev": [{"name": "libblockdev", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-fs": [{"name": "libblockdev-fs", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-loop": [{"name": "libblockdev-loop", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-part": [{"name": "libblockdev-part", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-swap": [{"name": "libblockdev-swap", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": 
"NetworkManager-libnm", "version": "1.44.2", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bluez": [{"name": "bluez", "version": "5.77", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ModemManager-glib": [{"name": "ModemManager-glib", "version": "1.20.6", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.78.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libqrtr-glib": [{"name": "libqrtr-glib", "version": "1.2.2", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libqmi": [{"name": "libqmi", "version": "1.32.4", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libudisks2": [{"name": "libudisks2", "version": "2.10.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuser": [{"name": "libuser", "version": "0.64", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ntfsprogs": [{"name": "ntfsprogs", "version": "2022.10.3", "release": "3.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.20", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20221126", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mdadm": [{"name": "mdadm", "version": "4.2", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-mdraid": [{"name": "libblockdev-mdraid", "version": "3.1.1", "release": "1.fc39", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "zram-generator": [{"name": "zram-generator", "version": "1.1.2", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth": [{"name": "plymouth", "version": "24.004.60", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth-scripts": [{"name": "plymouth-scripts", "version": "24.004.60", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "1.p5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "1.p5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.7.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.30", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "deltarpm": [{"name": "deltarpm", "version": "3.6.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.3", "release": "1.fc39", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unbound-anchor": [{"name": "unbound-anchor", "version": "1.20.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unbound-libs": [{"name": "unbound-libs", "version": "1.20.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-unbound": [{"name": "python3-unbound", "version": "1.20.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfsverity": [{"name": "libfsverity", "version": "1.4", "release": "10.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnvme": [{"name": "libnvme", "version": "1.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-nvme": [{"name": "libblockdev-nvme", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-sysinit": [{"name": "nss-sysinit", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss": [{"name": "nss", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "volume_key-libs": [{"name": "volume_key-libs", "version": "0.3.12", "release": "20.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"libblockdev-crypto": [{"name": "libblockdev-crypto", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "udisks2": [{"name": "udisks2", "version": "2.10.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-plugin-modem-manager": [{"name": "fwupd-plugin-modem-manager", "version": "1.9.21", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-plugin-uefi-capsule-data": [{"name": "fwupd-plugin-uefi-capsule-data", "version": "1.9.21", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd": [{"name": "fwupd", "version": "1.9.21", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.9", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "40.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.2.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.5", "release": "8.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "39.7", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "39.7", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.20", 
"release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.8.0", "release": "6.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.7", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.0.5", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "12.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "10.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.9.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.9.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.0.5", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": 
"rpm-plugin-selinux", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zram-generator-defaults": [{"name": "zram-generator-defaults", "version": "1.1.2", "release": "11.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dhcp-client": [{"name": "dhcp-client", "version": "4.4.3", "release": "9.P1.fc39", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "3.1.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "passwd": [{"name": "passwd", "version": "0.80", "release": "15.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.44.2", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.10.9", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.3p1", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.3p1", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "059", "release": "16.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.11.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.5", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.0", "release": "9.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"vim-minimal": [{"name": "vim-minimal", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "7.20230520.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "34.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.58", "release": "500.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.88", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.188", "release": "501.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.21", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2023.0511", "release": "3.fc39", "epoch": null, "arch": 
"noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "3.fc39", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.92", "release": "10.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20230801", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.10", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.083", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.13", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "4.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.037", "release": "3.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.13", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.15", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.28", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "500.fc39", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.52", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "500.fc39", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "500.fc39", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.37", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.13", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "500.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.54", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.77", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.89", "release": "500.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.16", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.19", "release": "500.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": 
[{"name": "perl-DynaLoader", "version": "1.54", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.38.2", "release": "502.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.38.2", "release": "502.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "boost-system": [{"name": "boost-system", "version": "1.81.0", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.40", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.40", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd-devel": [{"name": "libzstd-devel", "version": "1.5.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-devel": [{"name": "zlib-devel", "version": "1.2.13", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf-devel": [{"name": "elfutils-libelf-devel", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unzip": [{"name": "unzip", "version": "6.0", "release": "62.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "m4": [{"name": "m4", "version": "1.4.19", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmpc": [{"name": "libmpc", "version": "1.3.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "avahi-libs": [{"name": "avahi-libs", "version": "0.8", "release": "24.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "aspell": [{"name": "aspell", "version": "0.60.8", "release": "12.fc39", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "cpp": [{"name": "cpp", "version": 
"13.3.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bison": [{"name": "bison", "version": "3.8.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "flex": [{"name": "flex", "version": "2.6.4", "release": "13.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zip": [{"name": "zip", "version": "3.0", "release": "39.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-thread": [{"name": "boost-thread", "version": "1.81.0", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "strace": [{"name": "strace", "version": "6.10", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-devel": [{"name": "openssl-devel", "version": "3.1.4", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-headers": [{"name": "kernel-headers", "version": "6.10.3", "release": "200.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-headers-x86": [{"name": "glibc-headers-x86", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-devel": [{"name": "glibc-devel", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-devel": [{"name": "libxcrypt-devel", "version": "4.4.36", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client-devel": [{"name": "elfutils-debuginfod-client-devel", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "efivar-libs": [{"name": "efivar-libs", "version": "39", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mokutil": [{"name": "mokutil", "version": "0.6.0", "release": "7.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "boost-atomic": [{"name": "boost-atomic", "version": "1.81.0", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-filesystem": [{"name": 
"boost-filesystem", "version": "1.81.0", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-devel": [{"name": "xz-devel", "version": "5.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-devel": [{"name": "elfutils-devel", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tbb": [{"name": "tbb", "version": "2020.3", "release": "20.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dyninst": [{"name": "dyninst", "version": "12.3.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-runtime": [{"name": "systemtap-runtime", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-client": [{"name": "systemtap-client", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gc": [{"name": "gc", "version": "8.2.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "guile22": [{"name": "guile22", "version": "2.2.7", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "2.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gcc": [{"name": "gcc", "version": "13.3.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-devel": [{"name": "kernel-devel", "version": "6.10.9", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-devel": [{"name": "systemtap-devel", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap": [{"name": "systemtap", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "qa-tools": [{"name": "qa-tools", "version": "4.1", "release": "5.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"aspell-en": [{"name": "aspell-en", "version": "2020.12.07", "release": "8.fc39", "epoch": 50, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "0.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.46.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.46.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "23.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "1.rc1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.43", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.42", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "4.9.3", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsodium": [{"name": "libsodium", "version": "1.0.18", "release": "15.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-utils": [{"name": "dnf-utils", "version": "4.9.0", 
"release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "xxhash-libs": [{"name": "xxhash-libs", "version": "0.8.2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "2.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "3.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "13.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.46.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.46.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libmetalink": [{"name": "libmetalink", "version": "0.1.3", "release": "32.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.21.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "8.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.1", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "0.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gpm-libs": [{"name": "gpm-libs", "version": "1.20.7", "release": "44.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", 
"version": "1.07.1", "release": "19.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.31.2", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.fc39eng", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.fc39eng", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.fc39eng", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.1.0", "release": "4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.30.2", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.7.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": 
"67.7.2", "release": "8.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.4.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "6.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "3.1.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.1.4", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gdisk": [{"name": "gdisk", "version": "1.0.10", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pysocks": [{"name": "python3-pysocks", "version": "1.7.1", "release": "20.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3+socks": [{"name": "python3-urllib3+socks", "version": "1.26.19", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.28.2", "release": "5.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "6.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-netifaces": [{"name": "python3-netifaces", "version": "0.11.0", 
"release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.5", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "5.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "net-tools": [{"name": "net-tools", "version": "2.0", "release": "0.67.20160912git.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.5", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.5", "release": "8.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "23.2.1", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "16.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.23", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": 
"device-mapper-persistent-data", "version": "1.0.12", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.23", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "gpg-pubkey": [{"name": "gpg-pubkey", "version": "18b8e74c", "release": "62f2920f", "epoch": null, "arch": null, "source": "rpm"}], "rtl-sdr": [{"name": "rtl-sdr", "version": "0.6.0^20230921git1261fbb2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-pkcs11": [{"name": "openssl-pkcs11", "version": "0.4.12", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.4.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.16", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.41.238 closed. 8454 1726882436.18175: done with _execute_module (package_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'package_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882435.353414-9682-161685566641248/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 8454 1726882436.18201: _low_level_execute_command(): starting 8454 1726882436.18206: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882435.353414-9682-161685566641248/ > /dev/null 2>&1 && sleep 0' 8454 1726882436.18840: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 8454 1726882436.18845: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8454 1726882436.18858: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 8454 1726882436.19001: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8454 1726882436.21176: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8454 1726882436.21189: stdout chunk (state=3): >>><<< 8454 1726882436.21212: stderr chunk (state=3): >>><<< 8454 1726882436.21441: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config 
debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 8454 1726882436.21445: handler run complete 8454 1726882436.22180: variable 'ansible_facts' from source: unknown 8454 1726882436.22682: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8454 1726882436.25238: variable 'ansible_facts' from source: unknown 8454 1726882436.25666: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8454 1726882436.26481: attempt loop complete, returning result 8454 1726882436.26499: _execute() done 8454 1726882436.26502: dumping result to json 8454 1726882436.26684: done dumping result, returning 8454 1726882436.26693: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check which packages are installed [0affe814-3a2d-f59f-16b9-000000000497] 8454 1726882436.26699: sending task result for task 0affe814-3a2d-f59f-16b9-000000000497 8454 1726882436.30524: done sending task result for task 0affe814-3a2d-f59f-16b9-000000000497 8454 1726882436.30528: WORKER PROCESS EXITING ok: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 8454 1726882436.30708: no more pending results, returning what we have 8454 1726882436.30712: results queue empty 8454 1726882436.30713: checking for any_errors_fatal 8454 1726882436.30719: done checking for any_errors_fatal 8454 1726882436.30720: checking for max_fail_percentage 8454 1726882436.30722: done checking for max_fail_percentage 8454 1726882436.30723: checking to see if all hosts have failed and the running result is not ok 8454 1726882436.30723: done checking to see if all hosts have failed 8454 1726882436.30724: getting the remaining hosts for 
this loop 8454 1726882436.30726: done getting the remaining hosts for this loop 8454 1726882436.30730: getting the next task for host managed_node3 8454 1726882436.30741: done getting next task for host managed_node3 8454 1726882436.30746: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider 8454 1726882436.30751: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 8454 1726882436.30766: getting variables 8454 1726882436.30767: in VariableManager get_vars() 8454 1726882436.30818: Calling all_inventory to load vars for managed_node3 8454 1726882436.30822: Calling groups_inventory to load vars for managed_node3 8454 1726882436.30825: Calling all_plugins_inventory to load vars for managed_node3 8454 1726882436.30837: Calling all_plugins_play to load vars for managed_node3 8454 1726882436.30841: Calling groups_plugins_inventory to load vars for managed_node3 8454 1726882436.30845: Calling groups_plugins_play to load vars for managed_node3 8454 1726882436.32918: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8454 1726882436.35843: done with get_vars() 8454 1726882436.35888: done getting variables 8454 1726882436.35963: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print network provider] ************** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Friday 20 September 2024 21:33:56 -0400 (0:00:01.055) 0:00:34.377 ****** 8454 1726882436.36011: entering _queue_task() for managed_node3/debug 8454 1726882436.36559: worker is 1 (out of 1 available) 8454 1726882436.36571: exiting _queue_task() for managed_node3/debug 8454 1726882436.36587: done queuing things up, now waiting for results queue to drain 8454 1726882436.36589: waiting for pending results... 
8454 1726882436.36783: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Print network provider 8454 1726882436.37039: in run() - task 0affe814-3a2d-f59f-16b9-00000000007d 8454 1726882436.37044: variable 'ansible_search_path' from source: unknown 8454 1726882436.37048: variable 'ansible_search_path' from source: unknown 8454 1726882436.37069: calling self._execute() 8454 1726882436.37187: variable 'ansible_host' from source: host vars for 'managed_node3' 8454 1726882436.37256: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8454 1726882436.37261: variable 'omit' from source: magic vars 8454 1726882436.37740: variable 'ansible_distribution_major_version' from source: facts 8454 1726882436.37761: Evaluated conditional (ansible_distribution_major_version != '6'): True 8454 1726882436.37773: variable 'omit' from source: magic vars 8454 1726882436.37871: variable 'omit' from source: magic vars 8454 1726882436.38008: variable 'network_provider' from source: set_fact 8454 1726882436.38050: variable 'omit' from source: magic vars 8454 1726882436.38130: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 8454 1726882436.38167: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 8454 1726882436.38199: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 8454 1726882436.38225: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8454 1726882436.38439: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8454 1726882436.38442: variable 'inventory_hostname' from source: host vars for 'managed_node3' 8454 1726882436.38445: variable 'ansible_host' from source: host vars for 'managed_node3' 8454 1726882436.38448: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8454 1726882436.38451: Set connection var ansible_connection to ssh 8454 1726882436.38453: Set connection var ansible_shell_executable to /bin/sh 8454 1726882436.38464: Set connection var ansible_timeout to 10 8454 1726882436.38473: Set connection var ansible_shell_type to sh 8454 1726882436.38494: Set connection var ansible_pipelining to False 8454 1726882436.38507: Set connection var ansible_module_compression to ZIP_DEFLATED 8454 1726882436.38541: variable 'ansible_shell_executable' from source: unknown 8454 1726882436.38552: variable 'ansible_connection' from source: unknown 8454 1726882436.38564: variable 'ansible_module_compression' from source: unknown 8454 1726882436.38576: variable 'ansible_shell_type' from source: unknown 8454 1726882436.38588: variable 'ansible_shell_executable' from source: unknown 8454 1726882436.38597: variable 'ansible_host' from source: host vars for 'managed_node3' 8454 1726882436.38607: variable 'ansible_pipelining' from source: unknown 8454 1726882436.38615: variable 'ansible_timeout' from source: unknown 8454 1726882436.38626: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8454 1726882436.38796: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 8454 1726882436.38807: variable 'omit' from source: magic vars 8454 1726882436.38813: starting attempt loop 8454 1726882436.38816: running the handler 8454 1726882436.38859: handler run complete 8454 1726882436.38873: attempt loop complete, returning result 8454 1726882436.38876: _execute() done 8454 1726882436.38882: dumping result to json 8454 1726882436.38887: done dumping result, returning 8454 1726882436.38898: done running TaskExecutor() 
for managed_node3/TASK: fedora.linux_system_roles.network : Print network provider [0affe814-3a2d-f59f-16b9-00000000007d] 8454 1726882436.38911: sending task result for task 0affe814-3a2d-f59f-16b9-00000000007d ok: [managed_node3] => {} MSG: Using network provider: nm 8454 1726882436.39100: no more pending results, returning what we have 8454 1726882436.39104: results queue empty 8454 1726882436.39105: checking for any_errors_fatal 8454 1726882436.39116: done checking for any_errors_fatal 8454 1726882436.39117: checking for max_fail_percentage 8454 1726882436.39119: done checking for max_fail_percentage 8454 1726882436.39121: checking to see if all hosts have failed and the running result is not ok 8454 1726882436.39122: done checking to see if all hosts have failed 8454 1726882436.39122: getting the remaining hosts for this loop 8454 1726882436.39124: done getting the remaining hosts for this loop 8454 1726882436.39129: getting the next task for host managed_node3 8454 1726882436.39139: done getting next task for host managed_node3 8454 1726882436.39144: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 8454 1726882436.39148: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 8454 1726882436.39160: getting variables 8454 1726882436.39162: in VariableManager get_vars() 8454 1726882436.39200: Calling all_inventory to load vars for managed_node3 8454 1726882436.39203: Calling groups_inventory to load vars for managed_node3 8454 1726882436.39206: Calling all_plugins_inventory to load vars for managed_node3 8454 1726882436.39215: Calling all_plugins_play to load vars for managed_node3 8454 1726882436.39218: Calling groups_plugins_inventory to load vars for managed_node3 8454 1726882436.39276: Calling groups_plugins_play to load vars for managed_node3 8454 1726882436.39291: done sending task result for task 0affe814-3a2d-f59f-16b9-00000000007d 8454 1726882436.39294: WORKER PROCESS EXITING 8454 1726882436.40950: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8454 1726882436.43671: done with get_vars() 8454 1726882436.43697: done getting variables 8454 1726882436.43757: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider] *** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Friday 20 September 2024 21:33:56 -0400 (0:00:00.077) 0:00:34.455 ****** 8454 1726882436.43788: entering _queue_task() for managed_node3/fail 8454 1726882436.44040: worker is 1 (out of 1 available) 8454 1726882436.44057: exiting _queue_task() for managed_node3/fail 8454 1726882436.44070: done queuing things up, now waiting for results 
queue to drain 8454 1726882436.44072: waiting for pending results... 8454 1726882436.44265: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 8454 1726882436.44387: in run() - task 0affe814-3a2d-f59f-16b9-00000000007e 8454 1726882436.44399: variable 'ansible_search_path' from source: unknown 8454 1726882436.44405: variable 'ansible_search_path' from source: unknown 8454 1726882436.44438: calling self._execute() 8454 1726882436.44510: variable 'ansible_host' from source: host vars for 'managed_node3' 8454 1726882436.44518: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8454 1726882436.44528: variable 'omit' from source: magic vars 8454 1726882436.44850: variable 'ansible_distribution_major_version' from source: facts 8454 1726882436.44861: Evaluated conditional (ansible_distribution_major_version != '6'): True 8454 1726882436.44970: variable 'network_state' from source: role '' defaults 8454 1726882436.44977: Evaluated conditional (network_state != {}): False 8454 1726882436.44984: when evaluation is False, skipping this task 8454 1726882436.44987: _execute() done 8454 1726882436.44992: dumping result to json 8454 1726882436.44996: done dumping result, returning 8454 1726882436.45004: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider [0affe814-3a2d-f59f-16b9-00000000007e] 8454 1726882436.45010: sending task result for task 0affe814-3a2d-f59f-16b9-00000000007e 8454 1726882436.45105: done sending task result for task 0affe814-3a2d-f59f-16b9-00000000007e 8454 1726882436.45108: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 8454 
1726882436.45165: no more pending results, returning what we have 8454 1726882436.45170: results queue empty 8454 1726882436.45171: checking for any_errors_fatal 8454 1726882436.45179: done checking for any_errors_fatal 8454 1726882436.45180: checking for max_fail_percentage 8454 1726882436.45182: done checking for max_fail_percentage 8454 1726882436.45183: checking to see if all hosts have failed and the running result is not ok 8454 1726882436.45184: done checking to see if all hosts have failed 8454 1726882436.45185: getting the remaining hosts for this loop 8454 1726882436.45187: done getting the remaining hosts for this loop 8454 1726882436.45191: getting the next task for host managed_node3 8454 1726882436.45199: done getting next task for host managed_node3 8454 1726882436.45203: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 8454 1726882436.45208: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 8454 1726882436.45226: getting variables 8454 1726882436.45228: in VariableManager get_vars() 8454 1726882436.45273: Calling all_inventory to load vars for managed_node3 8454 1726882436.45276: Calling groups_inventory to load vars for managed_node3 8454 1726882436.45279: Calling all_plugins_inventory to load vars for managed_node3 8454 1726882436.45290: Calling all_plugins_play to load vars for managed_node3 8454 1726882436.45293: Calling groups_plugins_inventory to load vars for managed_node3 8454 1726882436.45296: Calling groups_plugins_play to load vars for managed_node3 8454 1726882436.47091: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8454 1726882436.49893: done with get_vars() 8454 1726882436.49928: done getting variables 8454 1726882436.50001: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8] *** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18 Friday 20 September 2024 21:33:56 -0400 (0:00:00.062) 0:00:34.518 ****** 8454 1726882436.50044: entering _queue_task() for managed_node3/fail 8454 1726882436.50401: worker is 1 (out of 1 available) 8454 1726882436.50414: exiting _queue_task() for managed_node3/fail 8454 1726882436.50434: done queuing things up, now waiting for results queue to drain 8454 1726882436.50438: waiting for pending results... 
8454 1726882436.50707: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 8454 1726882436.50832: in run() - task 0affe814-3a2d-f59f-16b9-00000000007f 8454 1726882436.50845: variable 'ansible_search_path' from source: unknown 8454 1726882436.50849: variable 'ansible_search_path' from source: unknown 8454 1726882436.50884: calling self._execute() 8454 1726882436.50961: variable 'ansible_host' from source: host vars for 'managed_node3' 8454 1726882436.50967: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8454 1726882436.50992: variable 'omit' from source: magic vars 8454 1726882436.51290: variable 'ansible_distribution_major_version' from source: facts 8454 1726882436.51304: Evaluated conditional (ansible_distribution_major_version != '6'): True 8454 1726882436.51405: variable 'network_state' from source: role '' defaults 8454 1726882436.51415: Evaluated conditional (network_state != {}): False 8454 1726882436.51420: when evaluation is False, skipping this task 8454 1726882436.51423: _execute() done 8454 1726882436.51426: dumping result to json 8454 1726882436.51433: done dumping result, returning 8454 1726882436.51441: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 [0affe814-3a2d-f59f-16b9-00000000007f] 8454 1726882436.51447: sending task result for task 0affe814-3a2d-f59f-16b9-00000000007f 8454 1726882436.51542: done sending task result for task 0affe814-3a2d-f59f-16b9-00000000007f 8454 1726882436.51546: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 8454 1726882436.51600: no more pending results, returning what we have 8454 1726882436.51604: results queue 
empty 8454 1726882436.51605: checking for any_errors_fatal 8454 1726882436.51612: done checking for any_errors_fatal 8454 1726882436.51613: checking for max_fail_percentage 8454 1726882436.51615: done checking for max_fail_percentage 8454 1726882436.51616: checking to see if all hosts have failed and the running result is not ok 8454 1726882436.51617: done checking to see if all hosts have failed 8454 1726882436.51618: getting the remaining hosts for this loop 8454 1726882436.51620: done getting the remaining hosts for this loop 8454 1726882436.51623: getting the next task for host managed_node3 8454 1726882436.51630: done getting next task for host managed_node3 8454 1726882436.51635: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 8454 1726882436.51640: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 8454 1726882436.51661: getting variables 8454 1726882436.51663: in VariableManager get_vars() 8454 1726882436.51701: Calling all_inventory to load vars for managed_node3 8454 1726882436.51704: Calling groups_inventory to load vars for managed_node3 8454 1726882436.51706: Calling all_plugins_inventory to load vars for managed_node3 8454 1726882436.51716: Calling all_plugins_play to load vars for managed_node3 8454 1726882436.51719: Calling groups_plugins_inventory to load vars for managed_node3 8454 1726882436.51723: Calling groups_plugins_play to load vars for managed_node3 8454 1726882436.53360: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8454 1726882436.55411: done with get_vars() 8454 1726882436.55433: done getting variables 8454 1726882436.55486: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later] *** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25 Friday 20 September 2024 21:33:56 -0400 (0:00:00.054) 0:00:34.572 ****** 8454 1726882436.55514: entering _queue_task() for managed_node3/fail 8454 1726882436.55739: worker is 1 (out of 1 available) 8454 1726882436.55755: exiting _queue_task() for managed_node3/fail 8454 1726882436.55768: done queuing things up, now waiting for results queue to drain 8454 1726882436.55770: waiting for pending results... 
8454 1726882436.55958: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 8454 1726882436.56072: in run() - task 0affe814-3a2d-f59f-16b9-000000000080 8454 1726882436.56085: variable 'ansible_search_path' from source: unknown 8454 1726882436.56089: variable 'ansible_search_path' from source: unknown 8454 1726882436.56123: calling self._execute() 8454 1726882436.56198: variable 'ansible_host' from source: host vars for 'managed_node3' 8454 1726882436.56203: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8454 1726882436.56216: variable 'omit' from source: magic vars 8454 1726882436.56740: variable 'ansible_distribution_major_version' from source: facts 8454 1726882436.56744: Evaluated conditional (ansible_distribution_major_version != '6'): True 8454 1726882436.56884: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 8454 1726882436.59004: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 8454 1726882436.59066: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 8454 1726882436.59101: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 8454 1726882436.59130: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 8454 1726882436.59154: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 8454 1726882436.59224: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 8454 1726882436.59249: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
8454 1726882436.59270: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
8454 1726882436.59309: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
8454 1726882436.59323: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
8454 1726882436.59402: variable 'ansible_distribution_major_version' from source: facts
8454 1726882436.59413: Evaluated conditional (ansible_distribution_major_version | int > 9): True
8454 1726882436.59512: variable 'ansible_distribution' from source: facts
8454 1726882436.59515: variable '__network_rh_distros' from source: role '' defaults
8454 1726882436.59523: Evaluated conditional (ansible_distribution in __network_rh_distros): False
8454 1726882436.59526: when evaluation is False, skipping this task
8454 1726882436.59533: _execute() done
8454 1726882436.59536: dumping result to json
8454 1726882436.59542: done dumping result, returning
8454 1726882436.59550: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [0affe814-3a2d-f59f-16b9-000000000080]
8454 1726882436.59555: sending task result for task 0affe814-3a2d-f59f-16b9-000000000080
8454 1726882436.59649: done sending task result for task 0affe814-3a2d-f59f-16b9-000000000080
8454 1726882436.59651: WORKER PROCESS EXITING
skipping: [managed_node3] => {
    "changed": false,
    "false_condition": "ansible_distribution in __network_rh_distros",
    "skip_reason": "Conditional result was False"
}
8454 1726882436.59706: no more pending results, returning what we have
8454 1726882436.59711: results queue empty
8454 1726882436.59712: checking for any_errors_fatal
8454 1726882436.59720: done checking for any_errors_fatal
8454 1726882436.59721: checking for max_fail_percentage
8454 1726882436.59723: done checking for max_fail_percentage
8454 1726882436.59724: checking to see if all hosts have failed and the running result is not ok
8454 1726882436.59725: done checking to see if all hosts have failed
8454 1726882436.59726: getting the remaining hosts for this loop
8454 1726882436.59728: done getting the remaining hosts for this loop
8454 1726882436.59732: getting the next task for host managed_node3
8454 1726882436.59743: done getting next task for host managed_node3
8454 1726882436.59747: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces
8454 1726882436.59753: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False
8454 1726882436.59776: getting variables
8454 1726882436.59778: in VariableManager get_vars()
8454 1726882436.59818: Calling all_inventory to load vars for managed_node3
8454 1726882436.59821: Calling groups_inventory to load vars for managed_node3
8454 1726882436.59824: Calling all_plugins_inventory to load vars for managed_node3
8454 1726882436.59833: Calling all_plugins_play to load vars for managed_node3
8454 1726882436.59837: Calling groups_plugins_inventory to load vars for managed_node3
8454 1726882436.59841: Calling groups_plugins_play to load vars for managed_node3
8454 1726882436.61118: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
8454 1726882436.62636: done with get_vars()
8454 1726882436.62656: done getting variables
8454 1726882436.62704: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)
TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces] ***
task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36
Friday 20 September 2024 21:33:56 -0400 (0:00:00.072) 0:00:34.644 ******
8454 1726882436.62729: entering _queue_task() for managed_node3/dnf
8454 1726882436.62946: worker is 1 (out of 1 available)
8454 1726882436.62961: exiting _queue_task() for managed_node3/dnf
8454 1726882436.62974: done queuing things up, now waiting for results queue to drain
8454 1726882436.62976: waiting for pending results...
8454 1726882436.63163: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces
8454 1726882436.63280: in run() - task 0affe814-3a2d-f59f-16b9-000000000081
8454 1726882436.63292: variable 'ansible_search_path' from source: unknown
8454 1726882436.63296: variable 'ansible_search_path' from source: unknown
8454 1726882436.63329: calling self._execute()
8454 1726882436.63402: variable 'ansible_host' from source: host vars for 'managed_node3'
8454 1726882436.63408: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
8454 1726882436.63425: variable 'omit' from source: magic vars
8454 1726882436.63722: variable 'ansible_distribution_major_version' from source: facts
8454 1726882436.63733: Evaluated conditional (ansible_distribution_major_version != '6'): True
8454 1726882436.63907: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name
8454 1726882436.65599: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py
8454 1726882436.65652: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py
8454 1726882436.65720: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py
8454 1726882436.65723: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py
8454 1726882436.65740: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py
8454 1726882436.65806: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
8454 1726882436.65835: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
8454 1726882436.65858: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
8454 1726882436.65891: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
8454 1726882436.65906: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
8454 1726882436.66005: variable 'ansible_distribution' from source: facts
8454 1726882436.66008: variable 'ansible_distribution_major_version' from source: facts
8454 1726882436.66015: Evaluated conditional (ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7): True
8454 1726882436.66109: variable '__network_wireless_connections_defined' from source: role '' defaults
8454 1726882436.66220: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
8454 1726882436.66241: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
8454 1726882436.66267: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
8454 1726882436.66301: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
8454 1726882436.66313: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
8454 1726882436.66351: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
8454 1726882436.66372: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
8454 1726882436.66394: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
8454 1726882436.66425: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
8454 1726882436.66438: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
8454 1726882436.66474: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
8454 1726882436.66497: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
8454 1726882436.66517: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
8454 1726882436.66550: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
8454 1726882436.66562: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
8454 1726882436.66691: variable 'network_connections' from source: task vars
8454 1726882436.66702: variable 'port2_profile' from source: play vars
8454 1726882436.66759: variable 'port2_profile' from source: play vars
8454 1726882436.66768: variable 'port1_profile' from source: play vars
8454 1726882436.66820: variable 'port1_profile' from source: play vars
8454 1726882436.66828: variable 'controller_profile' from source: play vars
8454 1726882436.66889: variable 'controller_profile' from source: play vars
8454 1726882436.66950: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name
8454 1726882436.67091: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py
8454 1726882436.67122: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py
8454 1726882436.67154: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py
8454 1726882436.67182: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py
8454 1726882436.67216: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False)
8454 1726882436.67239: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False)
8454 1726882436.67264: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False)
8454 1726882436.67287: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False)
8454 1726882436.67330: variable '__network_team_connections_defined' from source: role '' defaults
8454 1726882436.67530: variable 'network_connections' from source: task vars
8454 1726882436.67533: variable 'port2_profile' from source: play vars
8454 1726882436.67586: variable 'port2_profile' from source: play vars
8454 1726882436.67595: variable 'port1_profile' from source: play vars
8454 1726882436.67684: variable 'port1_profile' from source: play vars
8454 1726882436.67689: variable 'controller_profile' from source: play vars
8454 1726882436.67716: variable 'controller_profile' from source: play vars
8454 1726882436.67752: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False
8454 1726882436.67759: when evaluation is False, skipping this task
8454 1726882436.67763: _execute() done
8454 1726882436.67768: dumping result to json
8454 1726882436.67773: done dumping result, returning
8454 1726882436.67787: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [0affe814-3a2d-f59f-16b9-000000000081]
8454 1726882436.67793: sending task result for task 0affe814-3a2d-f59f-16b9-000000000081
8454 1726882436.67905: done sending task result for task 0affe814-3a2d-f59f-16b9-000000000081
8454 1726882436.67910: WORKER PROCESS EXITING
skipping: [managed_node3] => {
    "changed": false,
    "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined",
    "skip_reason": "Conditional result was False"
}
8454 1726882436.67970: no more pending results, returning what we have
8454 1726882436.67974: results queue empty
8454 1726882436.67975: checking for any_errors_fatal
8454 1726882436.67983: done checking for any_errors_fatal
8454 1726882436.67984: checking for max_fail_percentage
8454 1726882436.67986: done checking for max_fail_percentage
8454 1726882436.67987: checking to see if all hosts have failed and the running result is not ok
8454 1726882436.67988: done checking to see if all hosts have failed
8454 1726882436.67989: getting the remaining hosts for this loop
8454 1726882436.67991: done getting the remaining hosts for this loop
8454 1726882436.67995: getting the next task for host managed_node3
8454 1726882436.68002: done getting next task for host managed_node3
8454 1726882436.68007: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces
8454 1726882436.68012: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False
8454 1726882436.68033: getting variables
8454 1726882436.68036: in VariableManager get_vars()
8454 1726882436.68074: Calling all_inventory to load vars for managed_node3
8454 1726882436.68077: Calling groups_inventory to load vars for managed_node3
8454 1726882436.68080: Calling all_plugins_inventory to load vars for managed_node3
8454 1726882436.68091: Calling all_plugins_play to load vars for managed_node3
8454 1726882436.68094: Calling groups_plugins_inventory to load vars for managed_node3
8454 1726882436.68097: Calling groups_plugins_play to load vars for managed_node3
8454 1726882436.69355: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
8454 1726882436.70872: done with get_vars()
8454 1726882436.70899: done getting variables
redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf
8454 1726882436.70964: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)
TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces] ***
task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48
Friday 20 September 2024 21:33:56 -0400 (0:00:00.082) 0:00:34.727 ******
8454 1726882436.70993: entering _queue_task() for managed_node3/yum
8454 1726882436.71252: worker is 1 (out of 1 available)
8454 1726882436.71268: exiting _queue_task() for managed_node3/yum
8454 1726882436.71282: done queuing things up, now waiting for results queue to drain
8454 1726882436.71284: waiting for pending results...
8454 1726882436.71482: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces
8454 1726882436.71596: in run() - task 0affe814-3a2d-f59f-16b9-000000000082
8454 1726882436.71609: variable 'ansible_search_path' from source: unknown
8454 1726882436.71613: variable 'ansible_search_path' from source: unknown
8454 1726882436.71648: calling self._execute()
8454 1726882436.71720: variable 'ansible_host' from source: host vars for 'managed_node3'
8454 1726882436.71728: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
8454 1726882436.71741: variable 'omit' from source: magic vars
8454 1726882436.72059: variable 'ansible_distribution_major_version' from source: facts
8454 1726882436.72069: Evaluated conditional (ansible_distribution_major_version != '6'): True
8454 1726882436.72223: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name
8454 1726882436.73941: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py
8454 1726882436.73994: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py
8454 1726882436.74026: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py
8454 1726882436.74061: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py
8454 1726882436.74087: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py
8454 1726882436.74155: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
8454 1726882436.74178: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
8454 1726882436.74201: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
8454 1726882436.74233: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
8454 1726882436.74253: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
8454 1726882436.74325: variable 'ansible_distribution_major_version' from source: facts
8454 1726882436.74340: Evaluated conditional (ansible_distribution_major_version | int < 8): False
8454 1726882436.74343: when evaluation is False, skipping this task
8454 1726882436.74347: _execute() done
8454 1726882436.74352: dumping result to json
8454 1726882436.74357: done dumping result, returning
8454 1726882436.74366: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces [0affe814-3a2d-f59f-16b9-000000000082]
8454 1726882436.74370: sending task result for task 0affe814-3a2d-f59f-16b9-000000000082
8454 1726882436.74464: done sending task result for task 0affe814-3a2d-f59f-16b9-000000000082
8454 1726882436.74469: WORKER PROCESS EXITING
skipping: [managed_node3] => {
    "changed": false,
    "false_condition": "ansible_distribution_major_version | int < 8",
    "skip_reason": "Conditional result was False"
}
8454 1726882436.74528: no more pending results, returning what we have
8454 1726882436.74532: results queue empty
8454 1726882436.74533: checking for any_errors_fatal
8454 1726882436.74543: done checking for any_errors_fatal
8454 1726882436.74544: checking for max_fail_percentage
8454 1726882436.74546: done checking for max_fail_percentage
8454 1726882436.74548: checking to see if all hosts have failed and the running result is not ok
8454 1726882436.74549: done checking to see if all hosts have failed
8454 1726882436.74549: getting the remaining hosts for this loop
8454 1726882436.74551: done getting the remaining hosts for this loop
8454 1726882436.74555: getting the next task for host managed_node3
8454 1726882436.74563: done getting next task for host managed_node3
8454 1726882436.74567: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces
8454 1726882436.74574: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False
8454 1726882436.74595: getting variables
8454 1726882436.74598: in VariableManager get_vars()
8454 1726882436.74640: Calling all_inventory to load vars for managed_node3
8454 1726882436.74667: Calling groups_inventory to load vars for managed_node3
8454 1726882436.74671: Calling all_plugins_inventory to load vars for managed_node3
8454 1726882436.74681: Calling all_plugins_play to load vars for managed_node3
8454 1726882436.74684: Calling groups_plugins_inventory to load vars for managed_node3
8454 1726882436.74687: Calling groups_plugins_play to load vars for managed_node3
8454 1726882436.75847: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
8454 1726882436.77903: done with get_vars()
8454 1726882436.77924: done getting variables
8454 1726882436.77971: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)
TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces] ***
task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60
Friday 20 September 2024 21:33:56 -0400 (0:00:00.070) 0:00:34.797 ******
8454 1726882436.78003: entering _queue_task() for managed_node3/fail
8454 1726882436.78237: worker is 1 (out of 1 available)
8454 1726882436.78251: exiting _queue_task() for managed_node3/fail
8454 1726882436.78268: done queuing things up, now waiting for results queue to drain
8454 1726882436.78269: waiting for pending results...
8454 1726882436.78452: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces
8454 1726882436.78559: in run() - task 0affe814-3a2d-f59f-16b9-000000000083
8454 1726882436.78573: variable 'ansible_search_path' from source: unknown
8454 1726882436.78579: variable 'ansible_search_path' from source: unknown
8454 1726882436.78611: calling self._execute()
8454 1726882436.78685: variable 'ansible_host' from source: host vars for 'managed_node3'
8454 1726882436.78692: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
8454 1726882436.78704: variable 'omit' from source: magic vars
8454 1726882436.79002: variable 'ansible_distribution_major_version' from source: facts
8454 1726882436.79012: Evaluated conditional (ansible_distribution_major_version != '6'): True
8454 1726882436.79112: variable '__network_wireless_connections_defined' from source: role '' defaults
8454 1726882436.79303: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name
8454 1726882436.87141: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py
8454 1726882436.87152: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py
8454 1726882436.87208: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py
8454 1726882436.87273: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py
8454 1726882436.87300: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py
8454 1726882436.87494: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
8454 1726882436.87498: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
8454 1726882436.87501: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
8454 1726882436.87541: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
8454 1726882436.87564: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
8454 1726882436.87632: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
8454 1726882436.87671: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
8454 1726882436.87714: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
8454 1726882436.87771: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
8454 1726882436.87792: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
8454 1726882436.87859: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
8454 1726882436.87894: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
8454 1726882436.87941: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
8454 1726882436.87997: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
8454 1726882436.88018: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
8454 1726882436.88247: variable 'network_connections' from source: task vars
8454 1726882436.88262: variable 'port2_profile' from source: play vars
8454 1726882436.88318: variable 'port2_profile' from source: play vars
8454 1726882436.88326: variable 'port1_profile' from source: play vars
8454 1726882436.88393: variable 'port1_profile' from source: play vars
8454 1726882436.88401: variable 'controller_profile' from source: play vars
8454 1726882436.88489: variable 'controller_profile' from source: play vars
8454 1726882436.88524: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name
8454 1726882436.88700: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py
8454 1726882436.88743: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py
8454 1726882436.88771: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py
8454 1726882436.88796: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py
8454 1726882436.88835: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False)
8454 1726882436.88854: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False)
8454 1726882436.88875: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False)
8454 1726882436.88898: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False)
8454 1726882436.88937: variable '__network_team_connections_defined' from source: role '' defaults
8454 1726882436.89131: variable 'network_connections' from source: task vars
8454 1726882436.89135: variable 'port2_profile' from source: play vars
8454 1726882436.89188: variable 'port2_profile' from source: play vars
8454 1726882436.89195: variable 'port1_profile' from source: play vars
8454 1726882436.89247: variable 'port1_profile' from source: play vars
8454 1726882436.89257: variable 'controller_profile' from source: play vars
8454 1726882436.89306: variable 'controller_profile' from source: play vars
8454 1726882436.89327: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False
8454 1726882436.89340: when evaluation is False, skipping this task
8454 1726882436.89343: _execute() done
8454 1726882436.89346: dumping result to json
8454 1726882436.89348: done dumping result, returning
8454 1726882436.89351: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces [0affe814-3a2d-f59f-16b9-000000000083]
8454 1726882436.89353: sending task result for task 0affe814-3a2d-f59f-16b9-000000000083
8454 1726882436.89452: done sending task result for task 0affe814-3a2d-f59f-16b9-000000000083
8454 1726882436.89455: WORKER PROCESS EXITING
skipping: [managed_node3] => {
    "changed": false,
    "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined",
    "skip_reason": "Conditional result was False"
}
8454 1726882436.89517: no more pending results, returning what we have
8454 1726882436.89521: results queue empty
8454 1726882436.89522: checking for any_errors_fatal
8454 1726882436.89528: done checking for any_errors_fatal
8454 1726882436.89529: checking for max_fail_percentage
8454 1726882436.89531: done checking for max_fail_percentage
8454 1726882436.89532: checking to see if all hosts have failed and the running result is not ok
8454 1726882436.89533: done checking to see if all hosts have failed
8454 1726882436.89536: getting the remaining hosts for this loop
8454 1726882436.89538: done getting the remaining hosts for this loop
8454 1726882436.89542: getting the next task for host managed_node3
8454 1726882436.89549: done getting next task for host managed_node3
8454 1726882436.89553: ^ task is: TASK: fedora.linux_system_roles.network : Install packages
8454 1726882436.89558: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False
8454 1726882436.89579: getting variables
8454 1726882436.89581: in VariableManager get_vars()
8454 1726882436.89624: Calling all_inventory to load vars for managed_node3
8454 1726882436.89627: Calling groups_inventory to load vars for managed_node3
8454 1726882436.89630: Calling all_plugins_inventory to load vars for managed_node3
8454 1726882436.89645: Calling all_plugins_play to load vars for managed_node3
8454 1726882436.89649: Calling groups_plugins_inventory to load vars for managed_node3
8454 1726882436.89653: Calling groups_plugins_play to load vars for managed_node3
8454 1726882436.95651: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
8454 1726882436.97163: done with get_vars()
8454 1726882436.97187: done getting variables
8454 1726882436.97226: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)
TASK [fedora.linux_system_roles.network : Install packages] ********************
task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73
Friday 20 September 2024 21:33:56 -0400 (0:00:00.192) 0:00:34.990 ******
8454 1726882436.97253: entering _queue_task() for managed_node3/package
8454 1726882436.97517: worker is 1 (out of 1 available)
8454 1726882436.97531: exiting _queue_task() for managed_node3/package
8454 1726882436.97547: done queuing things up, now waiting for results queue to drain
8454 1726882436.97549: waiting for pending results...
8454 1726882436.97739: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install packages
8454 1726882436.97861: in run() - task 0affe814-3a2d-f59f-16b9-000000000084
8454 1726882436.97873: variable 'ansible_search_path' from source: unknown
8454 1726882436.97881: variable 'ansible_search_path' from source: unknown
8454 1726882436.97913: calling self._execute()
8454 1726882436.97988: variable 'ansible_host' from source: host vars for 'managed_node3'
8454 1726882436.98000: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
8454 1726882436.98006: variable 'omit' from source: magic vars
8454 1726882436.98323: variable 'ansible_distribution_major_version' from source: facts
8454 1726882436.98336: Evaluated conditional (ansible_distribution_major_version != '6'): True
8454 1726882436.98503: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name
8454 1726882436.98729: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py
8454 1726882436.98770: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py
8454 1726882436.98836: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py
8454 1726882436.98867: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py
8454 1726882436.98963: variable 'network_packages' from source: role '' defaults
8454 1726882436.99055: variable '__network_provider_setup' from source: role '' defaults
8454 
1726882436.99065: variable '__network_service_name_default_nm' from source: role '' defaults 8454 1726882436.99119: variable '__network_service_name_default_nm' from source: role '' defaults 8454 1726882436.99127: variable '__network_packages_default_nm' from source: role '' defaults 8454 1726882436.99182: variable '__network_packages_default_nm' from source: role '' defaults 8454 1726882436.99344: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 8454 1726882437.00880: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 8454 1726882437.00932: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 8454 1726882437.00966: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 8454 1726882437.00998: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 8454 1726882437.01028: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 8454 1726882437.01097: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 8454 1726882437.01121: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 8454 1726882437.01144: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 8454 1726882437.01183: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 8454 1726882437.01194: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 8454 1726882437.01232: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 8454 1726882437.01256: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 8454 1726882437.01283: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 8454 1726882437.01314: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 8454 1726882437.01326: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 8454 1726882437.01514: variable '__network_packages_default_gobject_packages' from source: role '' defaults 8454 1726882437.01605: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 8454 1726882437.01626: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, 
class_only=False) 8454 1726882437.01649: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 8454 1726882437.01682: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 8454 1726882437.01693: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 8454 1726882437.01770: variable 'ansible_python' from source: facts 8454 1726882437.01792: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 8454 1726882437.01860: variable '__network_wpa_supplicant_required' from source: role '' defaults 8454 1726882437.01925: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 8454 1726882437.02030: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 8454 1726882437.02054: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 8454 1726882437.02075: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 8454 1726882437.02108: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 8454 1726882437.02120: Loading 
FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 8454 1726882437.02339: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 8454 1726882437.02352: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 8454 1726882437.02355: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 8454 1726882437.02358: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 8454 1726882437.02361: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 8454 1726882437.02520: variable 'network_connections' from source: task vars 8454 1726882437.02538: variable 'port2_profile' from source: play vars 8454 1726882437.02665: variable 'port2_profile' from source: play vars 8454 1726882437.02690: variable 'port1_profile' from source: play vars 8454 1726882437.02823: variable 'port1_profile' from source: play vars 8454 1726882437.02883: variable 'controller_profile' from source: play vars 8454 1726882437.02983: variable 'controller_profile' from source: play vars 8454 1726882437.03084: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 
(found_in_cache=True, class_only=False) 8454 1726882437.03124: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 8454 1726882437.03242: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 8454 1726882437.03246: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 8454 1726882437.03353: variable '__network_wireless_connections_defined' from source: role '' defaults 8454 1726882437.03712: variable 'network_connections' from source: task vars 8454 1726882437.03726: variable 'port2_profile' from source: play vars 8454 1726882437.03853: variable 'port2_profile' from source: play vars 8454 1726882437.03871: variable 'port1_profile' from source: play vars 8454 1726882437.04000: variable 'port1_profile' from source: play vars 8454 1726882437.04024: variable 'controller_profile' from source: play vars 8454 1726882437.04153: variable 'controller_profile' from source: play vars 8454 1726882437.04200: variable '__network_packages_default_wireless' from source: role '' defaults 8454 1726882437.04340: variable '__network_wireless_connections_defined' from source: role '' defaults 8454 1726882437.04770: variable 'network_connections' from source: task vars 8454 1726882437.04793: variable 'port2_profile' from source: play vars 8454 1726882437.04893: variable 'port2_profile' from source: play vars 8454 1726882437.04902: variable 'port1_profile' from source: play vars 8454 1726882437.04990: variable 'port1_profile' from source: play vars 8454 1726882437.05109: variable 'controller_profile' from source: play vars 8454 1726882437.05113: variable 'controller_profile' from 
source: play vars 8454 1726882437.05139: variable '__network_packages_default_team' from source: role '' defaults 8454 1726882437.05253: variable '__network_team_connections_defined' from source: role '' defaults 8454 1726882437.05680: variable 'network_connections' from source: task vars 8454 1726882437.05695: variable 'port2_profile' from source: play vars 8454 1726882437.05815: variable 'port2_profile' from source: play vars 8454 1726882437.05831: variable 'port1_profile' from source: play vars 8454 1726882437.05974: variable 'port1_profile' from source: play vars 8454 1726882437.05993: variable 'controller_profile' from source: play vars 8454 1726882437.06080: variable 'controller_profile' from source: play vars 8454 1726882437.06165: variable '__network_service_name_default_initscripts' from source: role '' defaults 8454 1726882437.06341: variable '__network_service_name_default_initscripts' from source: role '' defaults 8454 1726882437.06344: variable '__network_packages_default_initscripts' from source: role '' defaults 8454 1726882437.06347: variable '__network_packages_default_initscripts' from source: role '' defaults 8454 1726882437.06685: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 8454 1726882437.07547: variable 'network_connections' from source: task vars 8454 1726882437.07562: variable 'port2_profile' from source: play vars 8454 1726882437.07640: variable 'port2_profile' from source: play vars 8454 1726882437.07659: variable 'port1_profile' from source: play vars 8454 1726882437.07743: variable 'port1_profile' from source: play vars 8454 1726882437.07759: variable 'controller_profile' from source: play vars 8454 1726882437.07841: variable 'controller_profile' from source: play vars 8454 1726882437.07885: variable 'ansible_distribution' from source: facts 8454 1726882437.07888: variable '__network_rh_distros' from source: role '' defaults 8454 1726882437.07891: variable 'ansible_distribution_major_version' 
from source: facts 8454 1726882437.07901: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 8454 1726882437.08131: variable 'ansible_distribution' from source: facts 8454 1726882437.08146: variable '__network_rh_distros' from source: role '' defaults 8454 1726882437.08210: variable 'ansible_distribution_major_version' from source: facts 8454 1726882437.08214: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 8454 1726882437.08402: variable 'ansible_distribution' from source: facts 8454 1726882437.08412: variable '__network_rh_distros' from source: role '' defaults 8454 1726882437.08424: variable 'ansible_distribution_major_version' from source: facts 8454 1726882437.08477: variable 'network_provider' from source: set_fact 8454 1726882437.08552: variable 'ansible_facts' from source: unknown 8454 1726882437.09671: Evaluated conditional (not network_packages is subset(ansible_facts.packages.keys())): False 8454 1726882437.09681: when evaluation is False, skipping this task 8454 1726882437.09691: _execute() done 8454 1726882437.09700: dumping result to json 8454 1726882437.09709: done dumping result, returning 8454 1726882437.09722: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install packages [0affe814-3a2d-f59f-16b9-000000000084] 8454 1726882437.09738: sending task result for task 0affe814-3a2d-f59f-16b9-000000000084 skipping: [managed_node3] => { "changed": false, "false_condition": "not network_packages is subset(ansible_facts.packages.keys())", "skip_reason": "Conditional result was False" } 8454 1726882437.09997: no more pending results, returning what we have 8454 1726882437.10002: results queue empty 8454 1726882437.10003: checking for any_errors_fatal 8454 1726882437.10017: done checking for any_errors_fatal 8454 1726882437.10018: checking for max_fail_percentage 8454 1726882437.10021: done checking for max_fail_percentage 8454 
1726882437.10023: checking to see if all hosts have failed and the running result is not ok 8454 1726882437.10024: done checking to see if all hosts have failed 8454 1726882437.10025: getting the remaining hosts for this loop 8454 1726882437.10027: done getting the remaining hosts for this loop 8454 1726882437.10032: getting the next task for host managed_node3 8454 1726882437.10042: done getting next task for host managed_node3 8454 1726882437.10052: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 8454 1726882437.10057: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 8454 1726882437.10081: getting variables 8454 1726882437.10083: in VariableManager get_vars() 8454 1726882437.10340: Calling all_inventory to load vars for managed_node3 8454 1726882437.10344: Calling groups_inventory to load vars for managed_node3 8454 1726882437.10348: Calling all_plugins_inventory to load vars for managed_node3 8454 1726882437.10356: done sending task result for task 0affe814-3a2d-f59f-16b9-000000000084 8454 1726882437.10359: WORKER PROCESS EXITING 8454 1726882437.10371: Calling all_plugins_play to load vars for managed_node3 8454 1726882437.10375: Calling groups_plugins_inventory to load vars for managed_node3 8454 1726882437.10380: Calling groups_plugins_play to load vars for managed_node3 8454 1726882437.12811: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8454 1726882437.16354: done with get_vars() 8454 1726882437.16401: done getting variables 8454 1726882437.16476: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] *** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85 Friday 20 September 2024 21:33:57 -0400 (0:00:00.192) 0:00:35.182 ****** 8454 1726882437.16520: entering _queue_task() for managed_node3/package 8454 1726882437.16886: worker is 1 (out of 1 available) 8454 1726882437.16901: exiting _queue_task() for managed_node3/package 8454 1726882437.16915: done queuing things up, now waiting for results queue to drain 8454 1726882437.16917: waiting for pending results... 
8454 1726882437.17239: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 8454 1726882437.17457: in run() - task 0affe814-3a2d-f59f-16b9-000000000085 8454 1726882437.17489: variable 'ansible_search_path' from source: unknown 8454 1726882437.17499: variable 'ansible_search_path' from source: unknown 8454 1726882437.17547: calling self._execute() 8454 1726882437.17654: variable 'ansible_host' from source: host vars for 'managed_node3' 8454 1726882437.17670: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8454 1726882437.17696: variable 'omit' from source: magic vars 8454 1726882437.18198: variable 'ansible_distribution_major_version' from source: facts 8454 1726882437.18253: Evaluated conditional (ansible_distribution_major_version != '6'): True 8454 1726882437.18652: variable 'network_state' from source: role '' defaults 8454 1726882437.18681: Evaluated conditional (network_state != {}): False 8454 1726882437.18693: when evaluation is False, skipping this task 8454 1726882437.18701: _execute() done 8454 1726882437.18708: dumping result to json 8454 1726882437.18716: done dumping result, returning 8454 1726882437.18729: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [0affe814-3a2d-f59f-16b9-000000000085] 8454 1726882437.18791: sending task result for task 0affe814-3a2d-f59f-16b9-000000000085 skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 8454 1726882437.18983: no more pending results, returning what we have 8454 1726882437.18987: results queue empty 8454 1726882437.18988: checking for any_errors_fatal 8454 1726882437.18994: done checking for any_errors_fatal 8454 1726882437.18995: checking for max_fail_percentage 8454 1726882437.18997: done 
checking for max_fail_percentage 8454 1726882437.18998: checking to see if all hosts have failed and the running result is not ok 8454 1726882437.18999: done checking to see if all hosts have failed 8454 1726882437.19000: getting the remaining hosts for this loop 8454 1726882437.19002: done getting the remaining hosts for this loop 8454 1726882437.19006: getting the next task for host managed_node3 8454 1726882437.19014: done getting next task for host managed_node3 8454 1726882437.19018: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 8454 1726882437.19022: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 8454 1726882437.19046: getting variables 8454 1726882437.19049: in VariableManager get_vars() 8454 1726882437.19097: Calling all_inventory to load vars for managed_node3 8454 1726882437.19101: Calling groups_inventory to load vars for managed_node3 8454 1726882437.19104: Calling all_plugins_inventory to load vars for managed_node3 8454 1726882437.19118: Calling all_plugins_play to load vars for managed_node3 8454 1726882437.19123: Calling groups_plugins_inventory to load vars for managed_node3 8454 1726882437.19127: Calling groups_plugins_play to load vars for managed_node3 8454 1726882437.20166: done sending task result for task 0affe814-3a2d-f59f-16b9-000000000085 8454 1726882437.20170: WORKER PROCESS EXITING 8454 1726882437.22790: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8454 1726882437.26151: done with get_vars() 8454 1726882437.26196: done getting variables 8454 1726882437.26270: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] *** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96 Friday 20 September 2024 21:33:57 -0400 (0:00:00.097) 0:00:35.280 ****** 8454 1726882437.26320: entering _queue_task() for managed_node3/package 8454 1726882437.26808: worker is 1 (out of 1 available) 8454 1726882437.26822: exiting _queue_task() for managed_node3/package 8454 1726882437.26956: done queuing things up, now waiting for results queue to drain 8454 1726882437.26958: waiting for pending results... 
8454 1726882437.27183: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 8454 1726882437.27392: in run() - task 0affe814-3a2d-f59f-16b9-000000000086 8454 1726882437.27423: variable 'ansible_search_path' from source: unknown 8454 1726882437.27436: variable 'ansible_search_path' from source: unknown 8454 1726882437.27492: calling self._execute() 8454 1726882437.27629: variable 'ansible_host' from source: host vars for 'managed_node3' 8454 1726882437.27635: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8454 1726882437.27650: variable 'omit' from source: magic vars 8454 1726882437.28142: variable 'ansible_distribution_major_version' from source: facts 8454 1726882437.28240: Evaluated conditional (ansible_distribution_major_version != '6'): True 8454 1726882437.28330: variable 'network_state' from source: role '' defaults 8454 1726882437.28356: Evaluated conditional (network_state != {}): False 8454 1726882437.28369: when evaluation is False, skipping this task 8454 1726882437.28380: _execute() done 8454 1726882437.28389: dumping result to json 8454 1726882437.28397: done dumping result, returning 8454 1726882437.28409: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [0affe814-3a2d-f59f-16b9-000000000086] 8454 1726882437.28421: sending task result for task 0affe814-3a2d-f59f-16b9-000000000086 skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 8454 1726882437.28710: no more pending results, returning what we have 8454 1726882437.28716: results queue empty 8454 1726882437.28717: checking for any_errors_fatal 8454 1726882437.28729: done checking for any_errors_fatal 8454 1726882437.28730: checking for max_fail_percentage 8454 1726882437.28733: done checking for 
max_fail_percentage 8454 1726882437.28736: checking to see if all hosts have failed and the running result is not ok 8454 1726882437.28737: done checking to see if all hosts have failed 8454 1726882437.28738: getting the remaining hosts for this loop 8454 1726882437.28741: done getting the remaining hosts for this loop 8454 1726882437.28745: getting the next task for host managed_node3 8454 1726882437.28754: done getting next task for host managed_node3 8454 1726882437.28758: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 8454 1726882437.28764: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 8454 1726882437.28794: getting variables 8454 1726882437.28796: in VariableManager get_vars() 8454 1726882437.29274: Calling all_inventory to load vars for managed_node3 8454 1726882437.29278: Calling groups_inventory to load vars for managed_node3 8454 1726882437.29282: Calling all_plugins_inventory to load vars for managed_node3 8454 1726882437.29371: done sending task result for task 0affe814-3a2d-f59f-16b9-000000000086 8454 1726882437.29375: WORKER PROCESS EXITING 8454 1726882437.29389: Calling all_plugins_play to load vars for managed_node3 8454 1726882437.29393: Calling groups_plugins_inventory to load vars for managed_node3 8454 1726882437.29398: Calling groups_plugins_play to load vars for managed_node3 8454 1726882437.32587: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8454 1726882437.35807: done with get_vars() 8454 1726882437.35946: done getting variables 8454 1726882437.36014: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109 Friday 20 September 2024 21:33:57 -0400 (0:00:00.098) 0:00:35.379 ****** 8454 1726882437.36242: entering _queue_task() for managed_node3/service 8454 1726882437.36876: worker is 1 (out of 1 available) 8454 1726882437.36892: exiting _queue_task() for managed_node3/service 8454 1726882437.36905: done queuing things up, now waiting for results queue to drain 8454 1726882437.36906: waiting for pending results... 
8454 1726882437.37192: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 8454 1726882437.37396: in run() - task 0affe814-3a2d-f59f-16b9-000000000087 8454 1726882437.37483: variable 'ansible_search_path' from source: unknown 8454 1726882437.37487: variable 'ansible_search_path' from source: unknown 8454 1726882437.37490: calling self._execute() 8454 1726882437.37588: variable 'ansible_host' from source: host vars for 'managed_node3' 8454 1726882437.37607: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8454 1726882437.37627: variable 'omit' from source: magic vars 8454 1726882437.38089: variable 'ansible_distribution_major_version' from source: facts 8454 1726882437.38110: Evaluated conditional (ansible_distribution_major_version != '6'): True 8454 1726882437.38285: variable '__network_wireless_connections_defined' from source: role '' defaults 8454 1726882437.38570: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 8454 1726882437.41978: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 8454 1726882437.42140: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 8454 1726882437.42143: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 8454 1726882437.42178: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 8454 1726882437.42218: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 8454 1726882437.42321: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 8454 
1726882437.42365: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 8454 1726882437.42414: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 8454 1726882437.42473: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 8454 1726882437.42498: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 8454 1726882437.42572: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 8454 1726882437.42616: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 8454 1726882437.42726: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 8454 1726882437.42732: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 8454 1726882437.42741: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, 
class_only=False) 8454 1726882437.42799: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 8454 1726882437.42844: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 8454 1726882437.42882: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 8454 1726882437.42942: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 8454 1726882437.42971: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 8454 1726882437.43216: variable 'network_connections' from source: task vars 8454 1726882437.43238: variable 'port2_profile' from source: play vars 8454 1726882437.43329: variable 'port2_profile' from source: play vars 8454 1726882437.43350: variable 'port1_profile' from source: play vars 8454 1726882437.43489: variable 'port1_profile' from source: play vars 8454 1726882437.43495: variable 'controller_profile' from source: play vars 8454 1726882437.43539: variable 'controller_profile' from source: play vars 8454 1726882437.43644: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 8454 1726882437.43879: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 8454 1726882437.43937: Loading TestModule 'files' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 8454 1726882437.43982: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 8454 1726882437.44022: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 8454 1726882437.44087: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 8454 1726882437.44142: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 8454 1726882437.44167: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 8454 1726882437.44206: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 8454 1726882437.44361: variable '__network_team_connections_defined' from source: role '' defaults 8454 1726882437.44648: variable 'network_connections' from source: task vars 8454 1726882437.44660: variable 'port2_profile' from source: play vars 8454 1726882437.44746: variable 'port2_profile' from source: play vars 8454 1726882437.44762: variable 'port1_profile' from source: play vars 8454 1726882437.44850: variable 'port1_profile' from source: play vars 8454 1726882437.44864: variable 'controller_profile' from source: play vars 8454 1726882437.44951: variable 'controller_profile' from source: play vars 8454 1726882437.44986: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 8454 1726882437.45003: when evaluation is False, skipping this task 8454 
1726882437.45020: _execute() done 8454 1726882437.45036: dumping result to json 8454 1726882437.45046: done dumping result, returning 8454 1726882437.45060: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces [0affe814-3a2d-f59f-16b9-000000000087] 8454 1726882437.45125: sending task result for task 0affe814-3a2d-f59f-16b9-000000000087 8454 1726882437.45443: done sending task result for task 0affe814-3a2d-f59f-16b9-000000000087 8454 1726882437.45448: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 8454 1726882437.45496: no more pending results, returning what we have 8454 1726882437.45500: results queue empty 8454 1726882437.45501: checking for any_errors_fatal 8454 1726882437.45506: done checking for any_errors_fatal 8454 1726882437.45508: checking for max_fail_percentage 8454 1726882437.45510: done checking for max_fail_percentage 8454 1726882437.45511: checking to see if all hosts have failed and the running result is not ok 8454 1726882437.45512: done checking to see if all hosts have failed 8454 1726882437.45513: getting the remaining hosts for this loop 8454 1726882437.45515: done getting the remaining hosts for this loop 8454 1726882437.45519: getting the next task for host managed_node3 8454 1726882437.45526: done getting next task for host managed_node3 8454 1726882437.45531: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 8454 1726882437.45537: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? 
(HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 8454 1726882437.45556: getting variables 8454 1726882437.45638: in VariableManager get_vars() 8454 1726882437.45688: Calling all_inventory to load vars for managed_node3 8454 1726882437.45692: Calling groups_inventory to load vars for managed_node3 8454 1726882437.45695: Calling all_plugins_inventory to load vars for managed_node3 8454 1726882437.45705: Calling all_plugins_play to load vars for managed_node3 8454 1726882437.45709: Calling groups_plugins_inventory to load vars for managed_node3 8454 1726882437.45713: Calling groups_plugins_play to load vars for managed_node3 8454 1726882437.48071: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8454 1726882437.51037: done with get_vars() 8454 1726882437.51070: done getting variables 8454 1726882437.51147: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] ***** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 Friday 20 September 2024 21:33:57 -0400 (0:00:00.150) 
0:00:35.529 ****** 8454 1726882437.51191: entering _queue_task() for managed_node3/service 8454 1726882437.51597: worker is 1 (out of 1 available) 8454 1726882437.51611: exiting _queue_task() for managed_node3/service 8454 1726882437.51624: done queuing things up, now waiting for results queue to drain 8454 1726882437.51625: waiting for pending results... 8454 1726882437.51895: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 8454 1726882437.52069: in run() - task 0affe814-3a2d-f59f-16b9-000000000088 8454 1726882437.52086: variable 'ansible_search_path' from source: unknown 8454 1726882437.52090: variable 'ansible_search_path' from source: unknown 8454 1726882437.52141: calling self._execute() 8454 1726882437.52246: variable 'ansible_host' from source: host vars for 'managed_node3' 8454 1726882437.52253: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8454 1726882437.52267: variable 'omit' from source: magic vars 8454 1726882437.52721: variable 'ansible_distribution_major_version' from source: facts 8454 1726882437.52735: Evaluated conditional (ansible_distribution_major_version != '6'): True 8454 1726882437.52967: variable 'network_provider' from source: set_fact 8454 1726882437.52973: variable 'network_state' from source: role '' defaults 8454 1726882437.52996: Evaluated conditional (network_provider == "nm" or network_state != {}): True 8454 1726882437.53003: variable 'omit' from source: magic vars 8454 1726882437.53092: variable 'omit' from source: magic vars 8454 1726882437.53136: variable 'network_service_name' from source: role '' defaults 8454 1726882437.53222: variable 'network_service_name' from source: role '' defaults 8454 1726882437.53372: variable '__network_provider_setup' from source: role '' defaults 8454 1726882437.53382: variable '__network_service_name_default_nm' from source: role '' defaults 8454 1726882437.53464: variable 
'__network_service_name_default_nm' from source: role '' defaults 8454 1726882437.53473: variable '__network_packages_default_nm' from source: role '' defaults 8454 1726882437.53568: variable '__network_packages_default_nm' from source: role '' defaults 8454 1726882437.53892: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 8454 1726882437.57313: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 8454 1726882437.57605: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 8454 1726882437.57609: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 8454 1726882437.57613: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 8454 1726882437.57617: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 8454 1726882437.57652: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 8454 1726882437.57698: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 8454 1726882437.57738: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 8454 1726882437.57799: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 8454 1726882437.57858: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 8454 1726882437.57917: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 8454 1726882437.57955: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 8454 1726882437.58057: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 8454 1726882437.58116: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 8454 1726882437.58132: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 8454 1726882437.58703: variable '__network_packages_default_gobject_packages' from source: role '' defaults 8454 1726882437.58870: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 8454 1726882437.58902: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 8454 1726882437.58940: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 8454 1726882437.58992: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 8454 1726882437.59009: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 8454 1726882437.59128: variable 'ansible_python' from source: facts 8454 1726882437.59243: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 8454 1726882437.59498: variable '__network_wpa_supplicant_required' from source: role '' defaults 8454 1726882437.59839: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 8454 1726882437.60073: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 8454 1726882437.60104: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 8454 1726882437.60340: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 8454 1726882437.60344: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 8454 1726882437.60347: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 8454 1726882437.60441: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 8454 1726882437.60481: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 8454 1726882437.60509: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 8454 1726882437.60561: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 8454 1726882437.60587: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 8454 1726882437.60783: variable 'network_connections' from source: task vars 8454 1726882437.60799: variable 'port2_profile' from source: play vars 8454 1726882437.60908: variable 'port2_profile' from source: play vars 8454 1726882437.60922: variable 'port1_profile' from source: play vars 8454 1726882437.61042: variable 'port1_profile' from source: play vars 8454 1726882437.61063: variable 'controller_profile' from source: play vars 8454 1726882437.61158: variable 'controller_profile' from source: play vars 8454 1726882437.61290: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 8454 1726882437.61545: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 8454 
1726882437.61609: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 8454 1726882437.61658: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 8454 1726882437.61713: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 8454 1726882437.61792: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 8454 1726882437.61826: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 8454 1726882437.61867: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 8454 1726882437.61914: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 8454 1726882437.61971: variable '__network_wireless_connections_defined' from source: role '' defaults 8454 1726882437.62539: variable 'network_connections' from source: task vars 8454 1726882437.62542: variable 'port2_profile' from source: play vars 8454 1726882437.62545: variable 'port2_profile' from source: play vars 8454 1726882437.62547: variable 'port1_profile' from source: play vars 8454 1726882437.62570: variable 'port1_profile' from source: play vars 8454 1726882437.62584: variable 'controller_profile' from source: play vars 8454 1726882437.62675: variable 'controller_profile' from source: play vars 8454 1726882437.62715: variable '__network_packages_default_wireless' from source: role '' defaults 8454 1726882437.62821: variable 
'__network_wireless_connections_defined' from source: role '' defaults 8454 1726882437.63440: variable 'network_connections' from source: task vars 8454 1726882437.63443: variable 'port2_profile' from source: play vars 8454 1726882437.63446: variable 'port2_profile' from source: play vars 8454 1726882437.63448: variable 'port1_profile' from source: play vars 8454 1726882437.63469: variable 'port1_profile' from source: play vars 8454 1726882437.63481: variable 'controller_profile' from source: play vars 8454 1726882437.63591: variable 'controller_profile' from source: play vars 8454 1726882437.63624: variable '__network_packages_default_team' from source: role '' defaults 8454 1726882437.63727: variable '__network_team_connections_defined' from source: role '' defaults 8454 1726882437.64293: variable 'network_connections' from source: task vars 8454 1726882437.64299: variable 'port2_profile' from source: play vars 8454 1726882437.64392: variable 'port2_profile' from source: play vars 8454 1726882437.64401: variable 'port1_profile' from source: play vars 8454 1726882437.64486: variable 'port1_profile' from source: play vars 8454 1726882437.64503: variable 'controller_profile' from source: play vars 8454 1726882437.64744: variable 'controller_profile' from source: play vars 8454 1726882437.64748: variable '__network_service_name_default_initscripts' from source: role '' defaults 8454 1726882437.64751: variable '__network_service_name_default_initscripts' from source: role '' defaults 8454 1726882437.64753: variable '__network_packages_default_initscripts' from source: role '' defaults 8454 1726882437.64819: variable '__network_packages_default_initscripts' from source: role '' defaults 8454 1726882437.65191: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 8454 1726882437.66027: variable 'network_connections' from source: task vars 8454 1726882437.66033: variable 'port2_profile' from source: play vars 8454 1726882437.66114: 
variable 'port2_profile' from source: play vars 8454 1726882437.66123: variable 'port1_profile' from source: play vars 8454 1726882437.66223: variable 'port1_profile' from source: play vars 8454 1726882437.66232: variable 'controller_profile' from source: play vars 8454 1726882437.66318: variable 'controller_profile' from source: play vars 8454 1726882437.66329: variable 'ansible_distribution' from source: facts 8454 1726882437.66333: variable '__network_rh_distros' from source: role '' defaults 8454 1726882437.66548: variable 'ansible_distribution_major_version' from source: facts 8454 1726882437.66567: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 8454 1726882437.66982: variable 'ansible_distribution' from source: facts 8454 1726882437.66986: variable '__network_rh_distros' from source: role '' defaults 8454 1726882437.66991: variable 'ansible_distribution_major_version' from source: facts 8454 1726882437.66999: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 8454 1726882437.67387: variable 'ansible_distribution' from source: facts 8454 1726882437.67391: variable '__network_rh_distros' from source: role '' defaults 8454 1726882437.67398: variable 'ansible_distribution_major_version' from source: facts 8454 1726882437.67647: variable 'network_provider' from source: set_fact 8454 1726882437.67650: variable 'omit' from source: magic vars 8454 1726882437.67781: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 8454 1726882437.67814: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 8454 1726882437.67837: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 8454 1726882437.67976: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8454 1726882437.67993: 
Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8454 1726882437.68028: variable 'inventory_hostname' from source: host vars for 'managed_node3' 8454 1726882437.68032: variable 'ansible_host' from source: host vars for 'managed_node3' 8454 1726882437.68096: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8454 1726882437.68343: Set connection var ansible_connection to ssh 8454 1726882437.68539: Set connection var ansible_shell_executable to /bin/sh 8454 1726882437.68542: Set connection var ansible_timeout to 10 8454 1726882437.68545: Set connection var ansible_shell_type to sh 8454 1726882437.68548: Set connection var ansible_pipelining to False 8454 1726882437.68551: Set connection var ansible_module_compression to ZIP_DEFLATED 8454 1726882437.68553: variable 'ansible_shell_executable' from source: unknown 8454 1726882437.68555: variable 'ansible_connection' from source: unknown 8454 1726882437.68557: variable 'ansible_module_compression' from source: unknown 8454 1726882437.68560: variable 'ansible_shell_type' from source: unknown 8454 1726882437.68562: variable 'ansible_shell_executable' from source: unknown 8454 1726882437.68564: variable 'ansible_host' from source: host vars for 'managed_node3' 8454 1726882437.68566: variable 'ansible_pipelining' from source: unknown 8454 1726882437.68568: variable 'ansible_timeout' from source: unknown 8454 1726882437.68570: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8454 1726882437.68808: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 8454 1726882437.68958: variable 'omit' from source: magic vars 8454 1726882437.68966: 
starting attempt loop 8454 1726882437.68969: running the handler 8454 1726882437.69179: variable 'ansible_facts' from source: unknown 8454 1726882437.71261: _low_level_execute_command(): starting 8454 1726882437.71265: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 8454 1726882437.72043: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 8454 1726882437.72084: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 8454 1726882437.72089: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 8454 1726882437.72133: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8454 1726882437.72186: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 8454 1726882437.72200: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 8454 1726882437.72218: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8454 1726882437.72370: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8454 1726882437.74307: stdout chunk (state=3): >>>/root <<< 8454 1726882437.74523: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8454 1726882437.74528: 
stdout chunk (state=3): >>><<< 8454 1726882437.74543: stderr chunk (state=3): >>><<< 8454 1726882437.74557: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 8454 1726882437.74571: _low_level_execute_command(): starting 8454 1726882437.74580: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882437.7455833-9780-61896177428769 `" && echo ansible-tmp-1726882437.7455833-9780-61896177428769="` echo /root/.ansible/tmp/ansible-tmp-1726882437.7455833-9780-61896177428769 `" ) && sleep 0' 8454 1726882437.75439: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 8454 1726882437.75454: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 8454 1726882437.75469: stderr chunk (state=3): 
>>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 8454 1726882437.75492: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 8454 1726882437.75600: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 8454 1726882437.75622: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8454 1726882437.75773: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8454 1726882437.77922: stdout chunk (state=3): >>>ansible-tmp-1726882437.7455833-9780-61896177428769=/root/.ansible/tmp/ansible-tmp-1726882437.7455833-9780-61896177428769 <<< 8454 1726882437.78056: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8454 1726882437.78151: stderr chunk (state=3): >>><<< 8454 1726882437.78168: stdout chunk (state=3): >>><<< 8454 1726882437.78249: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882437.7455833-9780-61896177428769=/root/.ansible/tmp/ansible-tmp-1726882437.7455833-9780-61896177428769 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 8454 1726882437.78352: variable 'ansible_module_compression' from source: unknown 8454 1726882437.78590: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-8454ynh6eu7y/ansiballz_cache/ansible.modules.systemd-ZIP_DEFLATED 8454 1726882437.78943: variable 'ansible_facts' from source: unknown 8454 1726882437.79156: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882437.7455833-9780-61896177428769/AnsiballZ_systemd.py 8454 1726882437.79538: Sending initial data 8454 1726882437.79542: Sent initial data (153 bytes) 8454 1726882437.80302: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 8454 1726882437.80313: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 8454 1726882437.80324: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 8454 1726882437.80429: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 
originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 8454 1726882437.80460: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8454 1726882437.80605: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8454 1726882437.82398: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 8454 1726882437.82625: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 8454 1726882437.82860: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-8454ynh6eu7y/tmp99yuno_t /root/.ansible/tmp/ansible-tmp-1726882437.7455833-9780-61896177428769/AnsiballZ_systemd.py <<< 8454 1726882437.82864: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882437.7455833-9780-61896177428769/AnsiballZ_systemd.py" <<< 8454 1726882437.82881: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-8454ynh6eu7y/tmp99yuno_t" to remote "/root/.ansible/tmp/ansible-tmp-1726882437.7455833-9780-61896177428769/AnsiballZ_systemd.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882437.7455833-9780-61896177428769/AnsiballZ_systemd.py" <<< 8454 1726882437.85743: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8454 1726882437.85850: stderr chunk (state=3): >>><<< 8454 1726882437.85861: stdout chunk (state=3): >>><<< 8454 1726882437.85991: done transferring module to remote 8454 1726882437.85995: _low_level_execute_command(): starting 8454 1726882437.85998: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882437.7455833-9780-61896177428769/ /root/.ansible/tmp/ansible-tmp-1726882437.7455833-9780-61896177428769/AnsiballZ_systemd.py && sleep 0' 8454 1726882437.86523: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 8454 1726882437.86541: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 8454 1726882437.86556: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 8454 1726882437.86579: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 8454 1726882437.86598: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 <<< 8454 1726882437.86611: stderr 
chunk (state=3): >>>debug2: match not found <<< 8454 1726882437.86625: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8454 1726882437.86654: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 8454 1726882437.86753: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 8454 1726882437.86766: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8454 1726882437.86914: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8454 1726882437.88911: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8454 1726882437.89000: stderr chunk (state=3): >>><<< 8454 1726882437.89010: stdout chunk (state=3): >>><<< 8454 1726882437.89031: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 8454 1726882437.89044: _low_level_execute_command(): starting 8454 1726882437.89055: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882437.7455833-9780-61896177428769/AnsiballZ_systemd.py && sleep 0' 8454 1726882437.89710: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 8454 1726882437.89725: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 8454 1726882437.89746: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 8454 1726882437.89817: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8454 1726882437.89881: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master 
<<< 8454 1726882437.89901: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 8454 1726882437.89943: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8454 1726882437.90100: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8454 1726882438.23033: stdout chunk (state=3): >>> {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "45s", "TimeoutAbortUSec": "45s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "abort", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "653", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 21:24:33 EDT", "ExecMainStartTimestampMonotonic": "18094121", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "653", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; 
argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "3421", "MemoryCurrent": "11821056", "MemoryAvailable": "infinity", "CPUUsageNSec": "760050000", "TasksCurrent": "4", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "18446744073709551615", "MemoryMax": "infinity", "StartupMemoryMax": "18446744073709551615", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "18446744073709551615", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "18446744073709551615", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "4425", "IPAccounting": "no", 
"ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "inf<<< 8454 1726882438.23085: stdout chunk (state=3): >>>inity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "14752", "LimitNPROCSoft": "14752", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "14752", "LimitSIGPENDINGSoft": "14752", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "RemoveIPC": 
"no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "dbus.socket system.slice 
sysinit.target", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "NetworkManager-wait-online.service network.service shutdown.target multi-user.target network.target cloud-init.service", "After": "systemd-journald.socket dbus-broker.service dbus.socket basic.target network-pre.target system.slice sysinit.target cloud-init-local.service", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "DropInPaths": "/usr/lib/systemd/system/service.d/10-timeout-abort.conf", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 21:32:41 EDT", "StateChangeTimestampMonotonic": "505811565", "InactiveExitTimestamp": "Fri 2024-09-20 21:24:33 EDT", "InactiveExitTimestampMonotonic": "18094364", "ActiveEnterTimestamp": "Fri 2024-09-20 21:24:34 EDT", "ActiveEnterTimestampMonotonic": "18531095", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 21:24:33 EDT", "ConditionTimestampMonotonic": "18086405", "AssertTimestamp": "Fri 2024-09-20 21:24:33 EDT", "AssertTimestampMonotonic": "18086408", "Transient": "no", 
"Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "1c8adba7025b47b4adeb74e368331c9f", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} <<< 8454 1726882438.25201: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.41.238 closed. <<< 8454 1726882438.25205: stdout chunk (state=3): >>><<< 8454 1726882438.25207: stderr chunk (state=3): >>><<< 8454 1726882438.25224: _low_level_execute_command() done: rc=0, stdout= {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "45s", "TimeoutAbortUSec": "45s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "abort", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "653", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 21:24:33 EDT", "ExecMainStartTimestampMonotonic": "18094121", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "653", "ExecMainCode": 
"0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "3421", "MemoryCurrent": "11821056", "MemoryAvailable": "infinity", "CPUUsageNSec": "760050000", "TasksCurrent": "4", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": 
"0", "MemoryHigh": "infinity", "StartupMemoryHigh": "18446744073709551615", "MemoryMax": "infinity", "StartupMemoryMax": "18446744073709551615", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "18446744073709551615", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "18446744073709551615", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "4425", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "14752", "LimitNPROCSoft": "14752", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "14752", "LimitSIGPENDINGSoft": "14752", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", 
"SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": 
"root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "dbus.socket system.slice sysinit.target", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "NetworkManager-wait-online.service network.service shutdown.target multi-user.target network.target cloud-init.service", "After": "systemd-journald.socket dbus-broker.service dbus.socket basic.target network-pre.target system.slice sysinit.target cloud-init-local.service", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "DropInPaths": "/usr/lib/systemd/system/service.d/10-timeout-abort.conf", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 21:32:41 EDT", "StateChangeTimestampMonotonic": "505811565", "InactiveExitTimestamp": "Fri 2024-09-20 21:24:33 EDT", "InactiveExitTimestampMonotonic": "18094364", "ActiveEnterTimestamp": "Fri 2024-09-20 21:24:34 EDT", "ActiveEnterTimestampMonotonic": "18531095", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": 
"no", "DefaultDependencies": "yes", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 21:24:33 EDT", "ConditionTimestampMonotonic": "18086405", "AssertTimestamp": "Fri 2024-09-20 21:24:33 EDT", "AssertTimestampMonotonic": "18086408", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "1c8adba7025b47b4adeb74e368331c9f", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: 
mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.41.238 closed. 8454 1726882438.25602: done with _execute_module (ansible.legacy.systemd, {'name': 'NetworkManager', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882437.7455833-9780-61896177428769/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 8454 1726882438.25606: _low_level_execute_command(): starting 8454 1726882438.25609: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882437.7455833-9780-61896177428769/ > /dev/null 2>&1 && sleep 0' 8454 1726882438.26351: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8454 1726882438.26411: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 8454 1726882438.26430: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 8454 1726882438.26455: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8454 1726882438.26616: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8454 1726882438.28840: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8454 1726882438.28844: stdout chunk (state=3): >>><<< 8454 1726882438.28846: stderr chunk (state=3): >>><<< 8454 1726882438.28849: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 
2 debug2: Received exit status from master 0 8454 1726882438.28851: handler run complete 8454 1726882438.28853: attempt loop complete, returning result 8454 1726882438.28855: _execute() done 8454 1726882438.28857: dumping result to json 8454 1726882438.28882: done dumping result, returning 8454 1726882438.28899: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [0affe814-3a2d-f59f-16b9-000000000088] 8454 1726882438.28910: sending task result for task 0affe814-3a2d-f59f-16b9-000000000088 ok: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 8454 1726882438.29588: no more pending results, returning what we have 8454 1726882438.29593: results queue empty 8454 1726882438.29594: checking for any_errors_fatal 8454 1726882438.29602: done checking for any_errors_fatal 8454 1726882438.29603: checking for max_fail_percentage 8454 1726882438.29605: done checking for max_fail_percentage 8454 1726882438.29607: checking to see if all hosts have failed and the running result is not ok 8454 1726882438.29608: done checking to see if all hosts have failed 8454 1726882438.29609: getting the remaining hosts for this loop 8454 1726882438.29611: done getting the remaining hosts for this loop 8454 1726882438.29615: getting the next task for host managed_node3 8454 1726882438.29622: done getting next task for host managed_node3 8454 1726882438.29696: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 8454 1726882438.29702: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? 
(HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 8454 1726882438.29716: getting variables 8454 1726882438.29718: in VariableManager get_vars() 8454 1726882438.29768: Calling all_inventory to load vars for managed_node3 8454 1726882438.29772: Calling groups_inventory to load vars for managed_node3 8454 1726882438.29776: Calling all_plugins_inventory to load vars for managed_node3 8454 1726882438.29853: done sending task result for task 0affe814-3a2d-f59f-16b9-000000000088 8454 1726882438.29856: WORKER PROCESS EXITING 8454 1726882438.29868: Calling all_plugins_play to load vars for managed_node3 8454 1726882438.29872: Calling groups_plugins_inventory to load vars for managed_node3 8454 1726882438.29876: Calling groups_plugins_play to load vars for managed_node3 8454 1726882438.32336: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8454 1726882438.35735: done with get_vars() 8454 1726882438.35772: done getting variables 8454 1726882438.35852: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: 
/tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Friday 20 September 2024 21:33:58 -0400 (0:00:00.847) 0:00:36.376 ****** 8454 1726882438.35895: entering _queue_task() for managed_node3/service 8454 1726882438.36307: worker is 1 (out of 1 available) 8454 1726882438.36320: exiting _queue_task() for managed_node3/service 8454 1726882438.36542: done queuing things up, now waiting for results queue to drain 8454 1726882438.36544: waiting for pending results... 8454 1726882438.36664: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 8454 1726882438.36891: in run() - task 0affe814-3a2d-f59f-16b9-000000000089 8454 1726882438.36940: variable 'ansible_search_path' from source: unknown 8454 1726882438.36944: variable 'ansible_search_path' from source: unknown 8454 1726882438.36960: calling self._execute() 8454 1726882438.37067: variable 'ansible_host' from source: host vars for 'managed_node3' 8454 1726882438.37090: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8454 1726882438.37115: variable 'omit' from source: magic vars 8454 1726882438.37638: variable 'ansible_distribution_major_version' from source: facts 8454 1726882438.37648: Evaluated conditional (ansible_distribution_major_version != '6'): True 8454 1726882438.37784: variable 'network_provider' from source: set_fact 8454 1726882438.37796: Evaluated conditional (network_provider == "nm"): True 8454 1726882438.37926: variable '__network_wpa_supplicant_required' from source: role '' defaults 8454 1726882438.38051: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 8454 1726882438.38313: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 8454 1726882438.41432: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 8454 1726882438.41531: Loading 
FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 8454 1726882438.41589: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 8454 1726882438.41652: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 8454 1726882438.41688: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 8454 1726882438.41808: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 8454 1726882438.41865: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 8454 1726882438.41903: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 8454 1726882438.42044: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 8454 1726882438.42049: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 8454 1726882438.42073: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 8454 1726882438.42130: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 8454 1726882438.42241: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 8454 1726882438.42274: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 8454 1726882438.42299: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 8454 1726882438.42362: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 8454 1726882438.42406: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 8454 1726882438.42467: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 8454 1726882438.42556: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 8454 1726882438.42589: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 8454 1726882438.42807: variable 'network_connections' from source: task vars 8454 
1726882438.42907: variable 'port2_profile' from source: play vars 8454 1726882438.42922: variable 'port2_profile' from source: play vars 8454 1726882438.42943: variable 'port1_profile' from source: play vars 8454 1726882438.43028: variable 'port1_profile' from source: play vars 8454 1726882438.43124: variable 'controller_profile' from source: play vars 8454 1726882438.43129: variable 'controller_profile' from source: play vars 8454 1726882438.43225: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 8454 1726882438.43462: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 8454 1726882438.43541: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 8454 1726882438.43571: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 8454 1726882438.43616: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 8454 1726882438.43682: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 8454 1726882438.43778: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 8454 1726882438.43786: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 8454 1726882438.43801: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 8454 1726882438.43869: variable '__network_wireless_connections_defined' from 
source: role '' defaults 8454 1726882438.44261: variable 'network_connections' from source: task vars 8454 1726882438.44274: variable 'port2_profile' from source: play vars 8454 1726882438.44360: variable 'port2_profile' from source: play vars 8454 1726882438.44374: variable 'port1_profile' from source: play vars 8454 1726882438.44469: variable 'port1_profile' from source: play vars 8454 1726882438.44556: variable 'controller_profile' from source: play vars 8454 1726882438.44582: variable 'controller_profile' from source: play vars 8454 1726882438.44640: Evaluated conditional (__network_wpa_supplicant_required): False 8454 1726882438.44644: when evaluation is False, skipping this task 8454 1726882438.44670: _execute() done 8454 1726882438.44677: dumping result to json 8454 1726882438.44680: done dumping result, returning 8454 1726882438.44725: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [0affe814-3a2d-f59f-16b9-000000000089] 8454 1726882438.44728: sending task result for task 0affe814-3a2d-f59f-16b9-000000000089 8454 1726882438.44961: done sending task result for task 0affe814-3a2d-f59f-16b9-000000000089 8454 1726882438.44965: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "__network_wpa_supplicant_required", "skip_reason": "Conditional result was False" } 8454 1726882438.45024: no more pending results, returning what we have 8454 1726882438.45029: results queue empty 8454 1726882438.45030: checking for any_errors_fatal 8454 1726882438.45063: done checking for any_errors_fatal 8454 1726882438.45064: checking for max_fail_percentage 8454 1726882438.45067: done checking for max_fail_percentage 8454 1726882438.45069: checking to see if all hosts have failed and the running result is not ok 8454 1726882438.45070: done checking to see if all hosts have failed 8454 1726882438.45071: getting the remaining hosts for this loop 8454 1726882438.45117: done 
getting the remaining hosts for this loop 8454 1726882438.45129: getting the next task for host managed_node3 8454 1726882438.45166: done getting next task for host managed_node3 8454 1726882438.45171: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service 8454 1726882438.45340: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 8454 1726882438.45361: getting variables 8454 1726882438.45362: in VariableManager get_vars() 8454 1726882438.45405: Calling all_inventory to load vars for managed_node3 8454 1726882438.45408: Calling groups_inventory to load vars for managed_node3 8454 1726882438.45411: Calling all_plugins_inventory to load vars for managed_node3 8454 1726882438.45422: Calling all_plugins_play to load vars for managed_node3 8454 1726882438.45426: Calling groups_plugins_inventory to load vars for managed_node3 8454 1726882438.45430: Calling groups_plugins_play to load vars for managed_node3 8454 1726882438.48711: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8454 1726882438.52977: done with get_vars() 8454 1726882438.53017: done getting variables 8454 1726882438.53093: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable network service] ************** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 Friday 20 September 2024 21:33:58 -0400 (0:00:00.172) 0:00:36.548 ****** 8454 1726882438.53131: entering _queue_task() for managed_node3/service 8454 1726882438.53531: worker is 1 (out of 1 available) 8454 1726882438.53549: exiting _queue_task() for managed_node3/service 8454 1726882438.53567: done queuing things up, now waiting for results queue to drain 8454 1726882438.53568: waiting for pending results... 
8454 1726882438.53937: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable network service 8454 1726882438.54014: in run() - task 0affe814-3a2d-f59f-16b9-00000000008a 8454 1726882438.54031: variable 'ansible_search_path' from source: unknown 8454 1726882438.54037: variable 'ansible_search_path' from source: unknown 8454 1726882438.54079: calling self._execute() 8454 1726882438.54245: variable 'ansible_host' from source: host vars for 'managed_node3' 8454 1726882438.54250: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8454 1726882438.54254: variable 'omit' from source: magic vars 8454 1726882438.54683: variable 'ansible_distribution_major_version' from source: facts 8454 1726882438.54687: Evaluated conditional (ansible_distribution_major_version != '6'): True 8454 1726882438.54984: variable 'network_provider' from source: set_fact 8454 1726882438.54988: Evaluated conditional (network_provider == "initscripts"): False 8454 1726882438.54990: when evaluation is False, skipping this task 8454 1726882438.54993: _execute() done 8454 1726882438.54995: dumping result to json 8454 1726882438.54997: done dumping result, returning 8454 1726882438.55021: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable network service [0affe814-3a2d-f59f-16b9-00000000008a] 8454 1726882438.55049: sending task result for task 0affe814-3a2d-f59f-16b9-00000000008a 8454 1726882438.55160: done sending task result for task 0affe814-3a2d-f59f-16b9-00000000008a 8454 1726882438.55163: WORKER PROCESS EXITING skipping: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 8454 1726882438.55265: no more pending results, returning what we have 8454 1726882438.55269: results queue empty 8454 1726882438.55270: checking for any_errors_fatal 8454 1726882438.55278: done checking for any_errors_fatal 8454 
1726882438.55279: checking for max_fail_percentage 8454 1726882438.55281: done checking for max_fail_percentage 8454 1726882438.55282: checking to see if all hosts have failed and the running result is not ok 8454 1726882438.55283: done checking to see if all hosts have failed 8454 1726882438.55283: getting the remaining hosts for this loop 8454 1726882438.55285: done getting the remaining hosts for this loop 8454 1726882438.55289: getting the next task for host managed_node3 8454 1726882438.55296: done getting next task for host managed_node3 8454 1726882438.55300: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 8454 1726882438.55305: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 8454 1726882438.55322: getting variables 8454 1726882438.55324: in VariableManager get_vars() 8454 1726882438.55554: Calling all_inventory to load vars for managed_node3 8454 1726882438.55562: Calling groups_inventory to load vars for managed_node3 8454 1726882438.55566: Calling all_plugins_inventory to load vars for managed_node3 8454 1726882438.55580: Calling all_plugins_play to load vars for managed_node3 8454 1726882438.55585: Calling groups_plugins_inventory to load vars for managed_node3 8454 1726882438.55589: Calling groups_plugins_play to load vars for managed_node3 8454 1726882438.58293: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8454 1726882438.61645: done with get_vars() 8454 1726882438.61667: done getting variables 8454 1726882438.61720: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] *** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150 Friday 20 September 2024 21:33:58 -0400 (0:00:00.086) 0:00:36.635 ****** 8454 1726882438.61755: entering _queue_task() for managed_node3/copy 8454 1726882438.62003: worker is 1 (out of 1 available) 8454 1726882438.62018: exiting _queue_task() for managed_node3/copy 8454 1726882438.62032: done queuing things up, now waiting for results queue to drain 8454 1726882438.62036: waiting for pending results... 
8454 1726882438.62237: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 8454 1726882438.62356: in run() - task 0affe814-3a2d-f59f-16b9-00000000008b 8454 1726882438.62373: variable 'ansible_search_path' from source: unknown 8454 1726882438.62378: variable 'ansible_search_path' from source: unknown 8454 1726882438.62408: calling self._execute() 8454 1726882438.62495: variable 'ansible_host' from source: host vars for 'managed_node3' 8454 1726882438.62500: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8454 1726882438.62510: variable 'omit' from source: magic vars 8454 1726882438.62828: variable 'ansible_distribution_major_version' from source: facts 8454 1726882438.62840: Evaluated conditional (ansible_distribution_major_version != '6'): True 8454 1726882438.62962: variable 'network_provider' from source: set_fact 8454 1726882438.62969: Evaluated conditional (network_provider == "initscripts"): False 8454 1726882438.62972: when evaluation is False, skipping this task 8454 1726882438.62979: _execute() done 8454 1726882438.62982: dumping result to json 8454 1726882438.62985: done dumping result, returning 8454 1726882438.62994: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [0affe814-3a2d-f59f-16b9-00000000008b] 8454 1726882438.63000: sending task result for task 0affe814-3a2d-f59f-16b9-00000000008b 8454 1726882438.63145: done sending task result for task 0affe814-3a2d-f59f-16b9-00000000008b skipping: [managed_node3] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 8454 1726882438.63204: no more pending results, returning what we have 8454 1726882438.63209: results queue empty 8454 1726882438.63210: checking for any_errors_fatal 8454 1726882438.63216: done checking for 
any_errors_fatal 8454 1726882438.63217: checking for max_fail_percentage 8454 1726882438.63219: done checking for max_fail_percentage 8454 1726882438.63220: checking to see if all hosts have failed and the running result is not ok 8454 1726882438.63221: done checking to see if all hosts have failed 8454 1726882438.63222: getting the remaining hosts for this loop 8454 1726882438.63224: done getting the remaining hosts for this loop 8454 1726882438.63228: getting the next task for host managed_node3 8454 1726882438.63242: done getting next task for host managed_node3 8454 1726882438.63250: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles 8454 1726882438.63255: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 8454 1726882438.63273: getting variables 8454 1726882438.63275: in VariableManager get_vars() 8454 1726882438.63314: Calling all_inventory to load vars for managed_node3 8454 1726882438.63317: Calling groups_inventory to load vars for managed_node3 8454 1726882438.63320: Calling all_plugins_inventory to load vars for managed_node3 8454 1726882438.63330: Calling all_plugins_play to load vars for managed_node3 8454 1726882438.63333: Calling groups_plugins_inventory to load vars for managed_node3 8454 1726882438.63339: Calling groups_plugins_play to load vars for managed_node3 8454 1726882438.63956: WORKER PROCESS EXITING 8454 1726882438.65749: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8454 1726882438.67425: done with get_vars() 8454 1726882438.67450: done getting variables TASK [fedora.linux_system_roles.network : Configure networking connection profiles] *** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Friday 20 September 2024 21:33:58 -0400 (0:00:00.057) 0:00:36.692 ****** 8454 1726882438.67528: entering _queue_task() for managed_node3/fedora.linux_system_roles.network_connections 8454 1726882438.67819: worker is 1 (out of 1 available) 8454 1726882438.67836: exiting _queue_task() for managed_node3/fedora.linux_system_roles.network_connections 8454 1726882438.67851: done queuing things up, now waiting for results queue to drain 8454 1726882438.67852: waiting for pending results... 
8454 1726882438.68052: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 8454 1726882438.68172: in run() - task 0affe814-3a2d-f59f-16b9-00000000008c 8454 1726882438.68189: variable 'ansible_search_path' from source: unknown 8454 1726882438.68193: variable 'ansible_search_path' from source: unknown 8454 1726882438.68228: calling self._execute() 8454 1726882438.68306: variable 'ansible_host' from source: host vars for 'managed_node3' 8454 1726882438.68310: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8454 1726882438.68323: variable 'omit' from source: magic vars 8454 1726882438.68638: variable 'ansible_distribution_major_version' from source: facts 8454 1726882438.68653: Evaluated conditional (ansible_distribution_major_version != '6'): True 8454 1726882438.68657: variable 'omit' from source: magic vars 8454 1726882438.68712: variable 'omit' from source: magic vars 8454 1726882438.68871: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 8454 1726882438.71123: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 8454 1726882438.71180: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 8454 1726882438.71210: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 8454 1726882438.71243: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 8454 1726882438.71267: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 8454 1726882438.71333: variable 'network_provider' from source: set_fact 8454 1726882438.71441: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 8454 1726882438.71481: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 8454 1726882438.71501: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 8454 1726882438.71535: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 8454 1726882438.71550: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 8454 1726882438.71612: variable 'omit' from source: magic vars 8454 1726882438.71707: variable 'omit' from source: magic vars 8454 1726882438.71794: variable 'network_connections' from source: task vars 8454 1726882438.71806: variable 'port2_profile' from source: play vars 8454 1726882438.71859: variable 'port2_profile' from source: play vars 8454 1726882438.71868: variable 'port1_profile' from source: play vars 8454 1726882438.71921: variable 'port1_profile' from source: play vars 8454 1726882438.71929: variable 'controller_profile' from source: play vars 8454 1726882438.71983: variable 'controller_profile' from source: play vars 8454 1726882438.72118: variable 'omit' from source: magic vars 8454 1726882438.72128: variable '__lsr_ansible_managed' from source: task vars 8454 1726882438.72181: variable '__lsr_ansible_managed' from source: task vars 8454 1726882438.72328: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup 8454 1726882438.72511: Loaded 
config def from plugin (lookup/template) 8454 1726882438.72515: Loading LookupModule 'template' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/template.py 8454 1726882438.72547: File lookup term: get_ansible_managed.j2 8454 1726882438.72552: variable 'ansible_search_path' from source: unknown 8454 1726882438.72555: evaluation_path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks 8454 1726882438.72570: search_path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/templates/get_ansible_managed.j2 /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/get_ansible_managed.j2 /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/templates/get_ansible_managed.j2 /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/get_ansible_managed.j2 /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 8454 1726882438.72585: variable 'ansible_search_path' from source: unknown 8454 1726882438.80041: variable 'ansible_managed' from source: unknown 8454 1726882438.80172: variable 'omit' from source: magic vars 8454 1726882438.80197: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 8454 1726882438.80219: Loading Connection 'ssh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 8454 1726882438.80237: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 8454 1726882438.80256: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8454 1726882438.80266: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8454 1726882438.80301: variable 'inventory_hostname' from source: host vars for 'managed_node3' 8454 1726882438.80305: variable 'ansible_host' from source: host vars for 'managed_node3' 8454 1726882438.80309: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8454 1726882438.80392: Set connection var ansible_connection to ssh 8454 1726882438.80401: Set connection var ansible_shell_executable to /bin/sh 8454 1726882438.80408: Set connection var ansible_timeout to 10 8454 1726882438.80411: Set connection var ansible_shell_type to sh 8454 1726882438.80420: Set connection var ansible_pipelining to False 8454 1726882438.80426: Set connection var ansible_module_compression to ZIP_DEFLATED 8454 1726882438.80446: variable 'ansible_shell_executable' from source: unknown 8454 1726882438.80450: variable 'ansible_connection' from source: unknown 8454 1726882438.80453: variable 'ansible_module_compression' from source: unknown 8454 1726882438.80457: variable 'ansible_shell_type' from source: unknown 8454 1726882438.80459: variable 'ansible_shell_executable' from source: unknown 8454 1726882438.80464: variable 'ansible_host' from source: host vars for 'managed_node3' 8454 1726882438.80468: variable 'ansible_pipelining' from source: unknown 8454 1726882438.80478: variable 'ansible_timeout' from source: unknown 8454 1726882438.80490: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8454 1726882438.80590: 
Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 8454 1726882438.80601: variable 'omit' from source: magic vars 8454 1726882438.80608: starting attempt loop 8454 1726882438.80611: running the handler 8454 1726882438.80624: _low_level_execute_command(): starting 8454 1726882438.80631: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 8454 1726882438.81159: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 8454 1726882438.81163: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8454 1726882438.81165: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 8454 1726882438.81168: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found <<< 8454 1726882438.81170: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8454 1726882438.81226: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 8454 1726882438.81233: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 8454 1726882438.81237: stderr chunk (state=3): >>>debug2: 
mux_client_hello_exchange: master version 4 <<< 8454 1726882438.81357: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8454 1726882438.83188: stdout chunk (state=3): >>>/root <<< 8454 1726882438.83298: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8454 1726882438.83346: stderr chunk (state=3): >>><<< 8454 1726882438.83350: stdout chunk (state=3): >>><<< 8454 1726882438.83370: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 8454 1726882438.83390: _low_level_execute_command(): starting 8454 1726882438.83411: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882438.8337042-9829-228998413152114 `" && echo ansible-tmp-1726882438.8337042-9829-228998413152114="` echo 
/root/.ansible/tmp/ansible-tmp-1726882438.8337042-9829-228998413152114 `" ) && sleep 0' 8454 1726882438.84225: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found <<< 8454 1726882438.84257: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8454 1726882438.84449: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 8454 1726882438.84452: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8454 1726882438.84517: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8454 1726882438.86575: stdout chunk (state=3): >>>ansible-tmp-1726882438.8337042-9829-228998413152114=/root/.ansible/tmp/ansible-tmp-1726882438.8337042-9829-228998413152114 <<< 8454 1726882438.86756: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8454 1726882438.86759: stderr chunk (state=3): >>><<< 8454 1726882438.86762: stdout chunk (state=3): >>><<< 8454 1726882438.86780: _low_level_execute_command() done: rc=0, 
stdout=ansible-tmp-1726882438.8337042-9829-228998413152114=/root/.ansible/tmp/ansible-tmp-1726882438.8337042-9829-228998413152114 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 8454 1726882438.86816: variable 'ansible_module_compression' from source: unknown 8454 1726882438.86854: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-8454ynh6eu7y/ansiballz_cache/ansible_collections.fedora.linux_system_roles.plugins.modules.network_connections-ZIP_DEFLATED 8454 1726882438.86885: variable 'ansible_facts' from source: unknown 8454 1726882438.86971: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882438.8337042-9829-228998413152114/AnsiballZ_network_connections.py 8454 1726882438.87183: Sending initial data 8454 1726882438.87187: Sent initial data (166 bytes) 8454 1726882438.88010: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: 
Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 8454 1726882438.88016: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8454 1726882438.88127: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8454 1726882438.89796: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" <<< 8454 1726882438.89802: stderr chunk (state=3): >>>debug2: Server supports extension 
"users-groups-by-id@openssh.com" revision 1 <<< 8454 1726882438.89905: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 8454 1726882438.90019: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-8454ynh6eu7y/tmpfkke3rpv /root/.ansible/tmp/ansible-tmp-1726882438.8337042-9829-228998413152114/AnsiballZ_network_connections.py <<< 8454 1726882438.90028: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882438.8337042-9829-228998413152114/AnsiballZ_network_connections.py" <<< 8454 1726882438.90134: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-8454ynh6eu7y/tmpfkke3rpv" to remote "/root/.ansible/tmp/ansible-tmp-1726882438.8337042-9829-228998413152114/AnsiballZ_network_connections.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882438.8337042-9829-228998413152114/AnsiballZ_network_connections.py" <<< 8454 1726882438.92233: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8454 1726882438.92346: stderr chunk (state=3): >>><<< 8454 1726882438.92350: stdout chunk (state=3): >>><<< 8454 1726882438.92386: done transferring module to remote 8454 1726882438.92404: _low_level_execute_command(): starting 8454 1726882438.92408: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882438.8337042-9829-228998413152114/ /root/.ansible/tmp/ansible-tmp-1726882438.8337042-9829-228998413152114/AnsiballZ_network_connections.py && sleep 0' 8454 1726882438.92991: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 8454 1726882438.92994: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 
10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8454 1726882438.92997: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration <<< 8454 1726882438.93000: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8454 1726882438.93055: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 8454 1726882438.93058: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8454 1726882438.93196: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8454 1726882438.95253: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8454 1726882438.95256: stdout chunk (state=3): >>><<< 8454 1726882438.95259: stderr chunk (state=3): >>><<< 8454 1726882438.95323: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration 
data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 8454 1726882438.95327: _low_level_execute_command(): starting 8454 1726882438.95330: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882438.8337042-9829-228998413152114/AnsiballZ_network_connections.py && sleep 0' 8454 1726882438.95868: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 8454 1726882438.95872: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8454 1726882438.95874: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 8454 1726882438.95876: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8454 1726882438.95930: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master 
debug2: fd 3 setting O_NONBLOCK <<< 8454 1726882438.95945: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8454 1726882438.96062: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8454 1726882439.53140: stdout chunk (state=3): >>>Traceback (most recent call last): File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_cs2xhib2/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back<<< 8454 1726882439.53195: stdout chunk (state=3): >>> File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_cs2xhib2/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on bond0.1/70730f1a-05dc-466f-88aa-5eb27a8fb665: error=unknown <<< 8454 1726882439.55497: stdout chunk (state=3): >>>Traceback (most recent call last): File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_cs2xhib2/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_cs2xhib2/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on bond0.0/ca811f30-7831-43b3-b534-83e0530ad93d: error=unknown 
<<< 8454 1726882439.57405: stdout chunk (state=3): >>>Traceback (most recent call last): File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_cs2xhib2/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_cs2xhib2/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on bond0/c39326a6-2860-4305-9ba8-6fb920a3fcc0: error=unknown <<< 8454 1726882439.57556: stdout chunk (state=3): >>> {"changed": true, "warnings": [], "stderr": "\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "bond0.1", "persistent_state": "absent", "state": "down"}, {"name": "bond0.0", "persistent_state": "absent", "state": "down"}, {"name": "bond0", "persistent_state": "absent", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "bond0.1", "persistent_state": "absent", "state": "down"}, {"name": "bond0.0", "persistent_state": "absent", "state": "down"}, {"name": "bond0", "persistent_state": "absent", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} <<< 8454 1726882439.59751: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.41.238 closed. 
<<< 8454 1726882439.59796: stderr chunk (state=3): >>><<< 8454 1726882439.59800: stdout chunk (state=3): >>><<< 8454 1726882439.59825: _low_level_execute_command() done: rc=0, stdout=Traceback (most recent call last): File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_cs2xhib2/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_cs2xhib2/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on bond0.1/70730f1a-05dc-466f-88aa-5eb27a8fb665: error=unknown Traceback (most recent call last): File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_cs2xhib2/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_cs2xhib2/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on bond0.0/ca811f30-7831-43b3-b534-83e0530ad93d: error=unknown Traceback (most recent call last): File 
"/tmp/ansible_fedora.linux_system_roles.network_connections_payload_cs2xhib2/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_cs2xhib2/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on bond0/c39326a6-2860-4305-9ba8-6fb920a3fcc0: error=unknown {"changed": true, "warnings": [], "stderr": "\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "bond0.1", "persistent_state": "absent", "state": "down"}, {"name": "bond0.0", "persistent_state": "absent", "state": "down"}, {"name": "bond0", "persistent_state": "absent", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "bond0.1", "persistent_state": "absent", "state": "down"}, {"name": "bond0.0", "persistent_state": "absent", "state": "down"}, {"name": "bond0", "persistent_state": "absent", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.41.238 closed. 8454 1726882439.60140: done with _execute_module (fedora.linux_system_roles.network_connections, {'provider': 'nm', 'connections': [{'name': 'bond0.1', 'persistent_state': 'absent', 'state': 'down'}, {'name': 'bond0.0', 'persistent_state': 'absent', 'state': 'down'}, {'name': 'bond0', 'persistent_state': 'absent', 'state': 'down'}], '__header': '#\n# Ansible managed\n#\n# system_role:network\n', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'fedora.linux_system_roles.network_connections', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882438.8337042-9829-228998413152114/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 8454 1726882439.60147: _low_level_execute_command(): starting 8454 
1726882439.60151: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882438.8337042-9829-228998413152114/ > /dev/null 2>&1 && sleep 0' 8454 1726882439.60983: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 8454 1726882439.61049: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8454 1726882439.61110: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 8454 1726882439.61123: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 8454 1726882439.61133: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8454 1726882439.61290: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8454 1726882439.63430: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8454 1726882439.63437: stdout chunk (state=3): >>><<< 8454 1726882439.63460: stderr chunk (state=3): >>><<< 8454 1726882439.63636: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 8454 1726882439.63640: handler run complete 8454 1726882439.63642: attempt loop complete, returning result 8454 1726882439.63645: _execute() done 8454 1726882439.63647: dumping result to json 8454 1726882439.63649: done dumping result, returning 8454 1726882439.63652: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [0affe814-3a2d-f59f-16b9-00000000008c] 8454 1726882439.63655: sending task result for task 0affe814-3a2d-f59f-16b9-00000000008c 8454 1726882439.63748: done sending task result for task 0affe814-3a2d-f59f-16b9-00000000008c 8454 1726882439.63751: WORKER PROCESS EXITING changed: [managed_node3] => { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "name": "bond0.1", "persistent_state": "absent", "state": "down" }, { "name": "bond0.0", "persistent_state": "absent", "state": "down" }, 
{ "name": "bond0", "persistent_state": "absent", "state": "down" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true } STDERR: 8454 1726882439.63998: no more pending results, returning what we have 8454 1726882439.64002: results queue empty 8454 1726882439.64003: checking for any_errors_fatal 8454 1726882439.64011: done checking for any_errors_fatal 8454 1726882439.64012: checking for max_fail_percentage 8454 1726882439.64014: done checking for max_fail_percentage 8454 1726882439.64015: checking to see if all hosts have failed and the running result is not ok 8454 1726882439.64016: done checking to see if all hosts have failed 8454 1726882439.64017: getting the remaining hosts for this loop 8454 1726882439.64018: done getting the remaining hosts for this loop 8454 1726882439.64022: getting the next task for host managed_node3 8454 1726882439.64029: done getting next task for host managed_node3 8454 1726882439.64033: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 8454 1726882439.64091: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 8454 1726882439.64105: getting variables 8454 1726882439.64107: in VariableManager get_vars() 8454 1726882439.64149: Calling all_inventory to load vars for managed_node3 8454 1726882439.64153: Calling groups_inventory to load vars for managed_node3 8454 1726882439.64156: Calling all_plugins_inventory to load vars for managed_node3 8454 1726882439.64166: Calling all_plugins_play to load vars for managed_node3 8454 1726882439.64170: Calling groups_plugins_inventory to load vars for managed_node3 8454 1726882439.64174: Calling groups_plugins_play to load vars for managed_node3 8454 1726882439.67836: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8454 1726882439.73001: done with get_vars() 8454 1726882439.73049: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Friday 20 September 2024 21:33:59 -0400 (0:00:01.056) 0:00:37.749 ****** 8454 1726882439.73156: entering _queue_task() for managed_node3/fedora.linux_system_roles.network_state 8454 1726882439.73659: worker is 1 (out of 1 available) 8454 1726882439.73681: exiting _queue_task() for managed_node3/fedora.linux_system_roles.network_state 8454 1726882439.73697: done queuing things up, now waiting for results queue to drain 8454 1726882439.73698: waiting for pending results... 
8454 1726882439.73945: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Configure networking state 8454 1726882439.74136: in run() - task 0affe814-3a2d-f59f-16b9-00000000008d 8454 1726882439.74154: variable 'ansible_search_path' from source: unknown 8454 1726882439.74157: variable 'ansible_search_path' from source: unknown 8454 1726882439.74240: calling self._execute() 8454 1726882439.74331: variable 'ansible_host' from source: host vars for 'managed_node3' 8454 1726882439.74340: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8454 1726882439.74355: variable 'omit' from source: magic vars 8454 1726882439.75091: variable 'ansible_distribution_major_version' from source: facts 8454 1726882439.75095: Evaluated conditional (ansible_distribution_major_version != '6'): True 8454 1726882439.75437: variable 'network_state' from source: role '' defaults 8454 1726882439.75452: Evaluated conditional (network_state != {}): False 8454 1726882439.75455: when evaluation is False, skipping this task 8454 1726882439.75458: _execute() done 8454 1726882439.75463: dumping result to json 8454 1726882439.75466: done dumping result, returning 8454 1726882439.75477: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Configure networking state [0affe814-3a2d-f59f-16b9-00000000008d] 8454 1726882439.75488: sending task result for task 0affe814-3a2d-f59f-16b9-00000000008d 8454 1726882439.75711: done sending task result for task 0affe814-3a2d-f59f-16b9-00000000008d 8454 1726882439.75714: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 8454 1726882439.75809: no more pending results, returning what we have 8454 1726882439.75815: results queue empty 8454 1726882439.75816: checking for any_errors_fatal 8454 1726882439.75830: done checking for any_errors_fatal 8454 1726882439.75832: 
checking for max_fail_percentage 8454 1726882439.75836: done checking for max_fail_percentage 8454 1726882439.75838: checking to see if all hosts have failed and the running result is not ok 8454 1726882439.75839: done checking to see if all hosts have failed 8454 1726882439.75840: getting the remaining hosts for this loop 8454 1726882439.75842: done getting the remaining hosts for this loop 8454 1726882439.75847: getting the next task for host managed_node3 8454 1726882439.76045: done getting next task for host managed_node3 8454 1726882439.76050: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 8454 1726882439.76056: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 8454 1726882439.76084: getting variables 8454 1726882439.76086: in VariableManager get_vars() 8454 1726882439.76132: Calling all_inventory to load vars for managed_node3 8454 1726882439.76251: Calling groups_inventory to load vars for managed_node3 8454 1726882439.76255: Calling all_plugins_inventory to load vars for managed_node3 8454 1726882439.76266: Calling all_plugins_play to load vars for managed_node3 8454 1726882439.76270: Calling groups_plugins_inventory to load vars for managed_node3 8454 1726882439.76274: Calling groups_plugins_play to load vars for managed_node3 8454 1726882439.79071: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8454 1726882439.80998: done with get_vars() 8454 1726882439.81021: done getting variables 8454 1726882439.81070: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Friday 20 September 2024 21:33:59 -0400 (0:00:00.079) 0:00:37.828 ****** 8454 1726882439.81105: entering _queue_task() for managed_node3/debug 8454 1726882439.81360: worker is 1 (out of 1 available) 8454 1726882439.81379: exiting _queue_task() for managed_node3/debug 8454 1726882439.81392: done queuing things up, now waiting for results queue to drain 8454 1726882439.81394: waiting for pending results... 
8454 1726882439.81588: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 8454 1726882439.81770: in run() - task 0affe814-3a2d-f59f-16b9-00000000008e 8454 1726882439.81774: variable 'ansible_search_path' from source: unknown 8454 1726882439.81776: variable 'ansible_search_path' from source: unknown 8454 1726882439.81810: calling self._execute() 8454 1726882439.82024: variable 'ansible_host' from source: host vars for 'managed_node3' 8454 1726882439.82028: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8454 1726882439.82032: variable 'omit' from source: magic vars 8454 1726882439.82367: variable 'ansible_distribution_major_version' from source: facts 8454 1726882439.82383: Evaluated conditional (ansible_distribution_major_version != '6'): True 8454 1726882439.82439: variable 'omit' from source: magic vars 8454 1726882439.82475: variable 'omit' from source: magic vars 8454 1726882439.82520: variable 'omit' from source: magic vars 8454 1726882439.82575: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 8454 1726882439.82610: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 8454 1726882439.82635: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 8454 1726882439.82661: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8454 1726882439.82670: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8454 1726882439.82708: variable 'inventory_hostname' from source: host vars for 'managed_node3' 8454 1726882439.82711: variable 'ansible_host' from source: host vars for 'managed_node3' 8454 1726882439.82802: variable 'ansible_ssh_extra_args' from source: host vars 
for 'managed_node3' 8454 1726882439.82843: Set connection var ansible_connection to ssh 8454 1726882439.82866: Set connection var ansible_shell_executable to /bin/sh 8454 1726882439.82869: Set connection var ansible_timeout to 10 8454 1726882439.82872: Set connection var ansible_shell_type to sh 8454 1726882439.82874: Set connection var ansible_pipelining to False 8454 1726882439.82887: Set connection var ansible_module_compression to ZIP_DEFLATED 8454 1726882439.82923: variable 'ansible_shell_executable' from source: unknown 8454 1726882439.82926: variable 'ansible_connection' from source: unknown 8454 1726882439.82929: variable 'ansible_module_compression' from source: unknown 8454 1726882439.82932: variable 'ansible_shell_type' from source: unknown 8454 1726882439.82936: variable 'ansible_shell_executable' from source: unknown 8454 1726882439.82938: variable 'ansible_host' from source: host vars for 'managed_node3' 8454 1726882439.82940: variable 'ansible_pipelining' from source: unknown 8454 1726882439.82943: variable 'ansible_timeout' from source: unknown 8454 1726882439.82987: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8454 1726882439.83106: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 8454 1726882439.83119: variable 'omit' from source: magic vars 8454 1726882439.83125: starting attempt loop 8454 1726882439.83127: running the handler 8454 1726882439.83241: variable '__network_connections_result' from source: set_fact 8454 1726882439.83292: handler run complete 8454 1726882439.83307: attempt loop complete, returning result 8454 1726882439.83310: _execute() done 8454 1726882439.83313: dumping result to json 8454 1726882439.83319: done dumping result, returning 8454 
1726882439.83329: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [0affe814-3a2d-f59f-16b9-00000000008e] 8454 1726882439.83335: sending task result for task 0affe814-3a2d-f59f-16b9-00000000008e 8454 1726882439.83428: done sending task result for task 0affe814-3a2d-f59f-16b9-00000000008e 8454 1726882439.83431: WORKER PROCESS EXITING ok: [managed_node3] => { "__network_connections_result.stderr_lines": [ "" ] } 8454 1726882439.83526: no more pending results, returning what we have 8454 1726882439.83529: results queue empty 8454 1726882439.83530: checking for any_errors_fatal 8454 1726882439.83538: done checking for any_errors_fatal 8454 1726882439.83539: checking for max_fail_percentage 8454 1726882439.83541: done checking for max_fail_percentage 8454 1726882439.83542: checking to see if all hosts have failed and the running result is not ok 8454 1726882439.83542: done checking to see if all hosts have failed 8454 1726882439.83543: getting the remaining hosts for this loop 8454 1726882439.83546: done getting the remaining hosts for this loop 8454 1726882439.83549: getting the next task for host managed_node3 8454 1726882439.83556: done getting next task for host managed_node3 8454 1726882439.83560: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 8454 1726882439.83564: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=23, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 8454 1726882439.83576: getting variables 8454 1726882439.83577: in VariableManager get_vars() 8454 1726882439.83613: Calling all_inventory to load vars for managed_node3 8454 1726882439.83616: Calling groups_inventory to load vars for managed_node3 8454 1726882439.83619: Calling all_plugins_inventory to load vars for managed_node3 8454 1726882439.83628: Calling all_plugins_play to load vars for managed_node3 8454 1726882439.83631: Calling groups_plugins_inventory to load vars for managed_node3 8454 1726882439.83642: Calling groups_plugins_play to load vars for managed_node3 8454 1726882439.84820: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8454 1726882439.87242: done with get_vars() 8454 1726882439.87265: done getting variables 8454 1726882439.87317: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_connections] *** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181 Friday 20 September 2024 21:33:59 -0400 (0:00:00.062) 0:00:37.891 ****** 8454 1726882439.87346: entering _queue_task() for managed_node3/debug 8454 1726882439.87575: worker is 1 (out of 1 available) 8454 1726882439.87592: exiting 
_queue_task() for managed_node3/debug 8454 1726882439.87607: done queuing things up, now waiting for results queue to drain 8454 1726882439.87609: waiting for pending results... 8454 1726882439.87806: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 8454 1726882439.87923: in run() - task 0affe814-3a2d-f59f-16b9-00000000008f 8454 1726882439.87938: variable 'ansible_search_path' from source: unknown 8454 1726882439.87941: variable 'ansible_search_path' from source: unknown 8454 1726882439.87980: calling self._execute() 8454 1726882439.88060: variable 'ansible_host' from source: host vars for 'managed_node3' 8454 1726882439.88067: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8454 1726882439.88080: variable 'omit' from source: magic vars 8454 1726882439.88389: variable 'ansible_distribution_major_version' from source: facts 8454 1726882439.88393: Evaluated conditional (ansible_distribution_major_version != '6'): True 8454 1726882439.88401: variable 'omit' from source: magic vars 8454 1726882439.88482: variable 'omit' from source: magic vars 8454 1726882439.88520: variable 'omit' from source: magic vars 8454 1726882439.88566: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 8454 1726882439.88608: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 8454 1726882439.88632: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 8454 1726882439.88661: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8454 1726882439.88664: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8454 1726882439.88699: variable 'inventory_hostname' from source: host vars for 
'managed_node3' 8454 1726882439.88703: variable 'ansible_host' from source: host vars for 'managed_node3' 8454 1726882439.88706: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8454 1726882439.89040: Set connection var ansible_connection to ssh 8454 1726882439.89043: Set connection var ansible_shell_executable to /bin/sh 8454 1726882439.89046: Set connection var ansible_timeout to 10 8454 1726882439.89048: Set connection var ansible_shell_type to sh 8454 1726882439.89051: Set connection var ansible_pipelining to False 8454 1726882439.89053: Set connection var ansible_module_compression to ZIP_DEFLATED 8454 1726882439.89055: variable 'ansible_shell_executable' from source: unknown 8454 1726882439.89057: variable 'ansible_connection' from source: unknown 8454 1726882439.89060: variable 'ansible_module_compression' from source: unknown 8454 1726882439.89062: variable 'ansible_shell_type' from source: unknown 8454 1726882439.89064: variable 'ansible_shell_executable' from source: unknown 8454 1726882439.89066: variable 'ansible_host' from source: host vars for 'managed_node3' 8454 1726882439.89068: variable 'ansible_pipelining' from source: unknown 8454 1726882439.89071: variable 'ansible_timeout' from source: unknown 8454 1726882439.89073: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8454 1726882439.89078: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 8454 1726882439.89082: variable 'omit' from source: magic vars 8454 1726882439.89084: starting attempt loop 8454 1726882439.89087: running the handler 8454 1726882439.89128: variable '__network_connections_result' from source: set_fact 8454 1726882439.89318: variable '__network_connections_result' from source: 
set_fact 8454 1726882439.89384: handler run complete 8454 1726882439.89427: attempt loop complete, returning result 8454 1726882439.89430: _execute() done 8454 1726882439.89433: dumping result to json 8454 1726882439.89441: done dumping result, returning 8454 1726882439.89455: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [0affe814-3a2d-f59f-16b9-00000000008f] 8454 1726882439.89459: sending task result for task 0affe814-3a2d-f59f-16b9-00000000008f ok: [managed_node3] => { "__network_connections_result": { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "name": "bond0.1", "persistent_state": "absent", "state": "down" }, { "name": "bond0.0", "persistent_state": "absent", "state": "down" }, { "name": "bond0", "persistent_state": "absent", "state": "down" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true, "failed": false, "stderr": "\n", "stderr_lines": [ "" ] } } 8454 1726882439.89742: no more pending results, returning what we have 8454 1726882439.89753: results queue empty 8454 1726882439.89755: checking for any_errors_fatal 8454 1726882439.89760: done checking for any_errors_fatal 8454 1726882439.89761: checking for max_fail_percentage 8454 1726882439.89763: done checking for max_fail_percentage 8454 1726882439.89763: checking to see if all hosts have failed and the running result is not ok 8454 1726882439.89764: done checking to see if all hosts have failed 8454 1726882439.89765: getting the remaining hosts for this loop 8454 1726882439.89767: done getting the remaining hosts for this loop 8454 1726882439.89770: getting the next task for host managed_node3 8454 1726882439.89780: done getting next task for host managed_node3 8454 1726882439.89784: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 
8454 1726882439.89788: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 8454 1726882439.89798: done sending task result for task 0affe814-3a2d-f59f-16b9-00000000008f 8454 1726882439.89801: WORKER PROCESS EXITING 8454 1726882439.89810: getting variables 8454 1726882439.89811: in VariableManager get_vars() 8454 1726882439.89851: Calling all_inventory to load vars for managed_node3 8454 1726882439.89861: Calling groups_inventory to load vars for managed_node3 8454 1726882439.89865: Calling all_plugins_inventory to load vars for managed_node3 8454 1726882439.89875: Calling all_plugins_play to load vars for managed_node3 8454 1726882439.89886: Calling groups_plugins_inventory to load vars for managed_node3 8454 1726882439.89890: Calling groups_plugins_play to load vars for managed_node3 8454 1726882439.91843: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8454 1726882439.93763: done with get_vars() 8454 1726882439.93790: done getting variables 8454 1726882439.93837: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] *** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186 Friday 20 September 2024 21:33:59 -0400 (0:00:00.065) 0:00:37.956 ****** 8454 1726882439.93865: entering _queue_task() for managed_node3/debug 8454 1726882439.94099: worker is 1 (out of 1 available) 8454 1726882439.94114: exiting _queue_task() for managed_node3/debug 8454 1726882439.94127: done queuing things up, now waiting for results queue to drain 8454 1726882439.94129: waiting for pending results... 8454 1726882439.94306: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 8454 1726882439.94423: in run() - task 0affe814-3a2d-f59f-16b9-000000000090 8454 1726882439.94437: variable 'ansible_search_path' from source: unknown 8454 1726882439.94440: variable 'ansible_search_path' from source: unknown 8454 1726882439.94475: calling self._execute() 8454 1726882439.94565: variable 'ansible_host' from source: host vars for 'managed_node3' 8454 1726882439.94645: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8454 1726882439.94649: variable 'omit' from source: magic vars 8454 1726882439.95074: variable 'ansible_distribution_major_version' from source: facts 8454 1726882439.95100: Evaluated conditional (ansible_distribution_major_version != '6'): True 8454 1726882439.95263: variable 'network_state' from source: role '' defaults 8454 1726882439.95285: Evaluated conditional (network_state != {}): False 8454 1726882439.95300: when evaluation is False, skipping this task 8454 1726882439.95339: _execute() done 8454 1726882439.95343: dumping result to json 8454 1726882439.95346: done dumping result, 
returning 8454 1726882439.95349: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [0affe814-3a2d-f59f-16b9-000000000090] 8454 1726882439.95351: sending task result for task 0affe814-3a2d-f59f-16b9-000000000090 8454 1726882439.95489: done sending task result for task 0affe814-3a2d-f59f-16b9-000000000090 8454 1726882439.95493: WORKER PROCESS EXITING skipping: [managed_node3] => { "false_condition": "network_state != {}" } 8454 1726882439.95557: no more pending results, returning what we have 8454 1726882439.95563: results queue empty 8454 1726882439.95564: checking for any_errors_fatal 8454 1726882439.95579: done checking for any_errors_fatal 8454 1726882439.95580: checking for max_fail_percentage 8454 1726882439.95583: done checking for max_fail_percentage 8454 1726882439.95584: checking to see if all hosts have failed and the running result is not ok 8454 1726882439.95585: done checking to see if all hosts have failed 8454 1726882439.95586: getting the remaining hosts for this loop 8454 1726882439.95588: done getting the remaining hosts for this loop 8454 1726882439.95594: getting the next task for host managed_node3 8454 1726882439.95607: done getting next task for host managed_node3 8454 1726882439.95613: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity 8454 1726882439.95618: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 8454 1726882439.95642: getting variables 8454 1726882439.95644: in VariableManager get_vars() 8454 1726882439.95690: Calling all_inventory to load vars for managed_node3 8454 1726882439.95694: Calling groups_inventory to load vars for managed_node3 8454 1726882439.95696: Calling all_plugins_inventory to load vars for managed_node3 8454 1726882439.95707: Calling all_plugins_play to load vars for managed_node3 8454 1726882439.95710: Calling groups_plugins_inventory to load vars for managed_node3 8454 1726882439.95714: Calling groups_plugins_play to load vars for managed_node3 8454 1726882439.97075: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8454 1726882439.99074: done with get_vars() 8454 1726882439.99096: done getting variables TASK [fedora.linux_system_roles.network : Re-test connectivity] **************** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192 Friday 20 September 2024 21:33:59 -0400 (0:00:00.053) 0:00:38.009 ****** 8454 1726882439.99178: entering _queue_task() for managed_node3/ping 8454 1726882439.99401: worker is 1 (out of 1 available) 8454 1726882439.99421: exiting _queue_task() for managed_node3/ping 8454 1726882439.99436: done queuing things up, now waiting for results queue to drain 8454 1726882439.99438: waiting for pending results... 
8454 1726882439.99774: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Re-test connectivity 8454 1726882439.99923: in run() - task 0affe814-3a2d-f59f-16b9-000000000091 8454 1726882439.99936: variable 'ansible_search_path' from source: unknown 8454 1726882439.99970: variable 'ansible_search_path' from source: unknown 8454 1726882440.00011: calling self._execute() 8454 1726882440.00067: variable 'ansible_host' from source: host vars for 'managed_node3' 8454 1726882440.00079: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8454 1726882440.00096: variable 'omit' from source: magic vars 8454 1726882440.00541: variable 'ansible_distribution_major_version' from source: facts 8454 1726882440.00601: Evaluated conditional (ansible_distribution_major_version != '6'): True 8454 1726882440.00614: variable 'omit' from source: magic vars 8454 1726882440.00674: variable 'omit' from source: magic vars 8454 1726882440.00733: variable 'omit' from source: magic vars 8454 1726882440.00786: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 8454 1726882440.00821: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 8454 1726882440.00837: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 8454 1726882440.00900: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8454 1726882440.00904: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8454 1726882440.00943: variable 'inventory_hostname' from source: host vars for 'managed_node3' 8454 1726882440.00951: variable 'ansible_host' from source: host vars for 'managed_node3' 8454 1726882440.00954: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8454 
1726882440.01097: Set connection var ansible_connection to ssh 8454 1726882440.01106: Set connection var ansible_shell_executable to /bin/sh 8454 1726882440.01109: Set connection var ansible_timeout to 10 8454 1726882440.01112: Set connection var ansible_shell_type to sh 8454 1726882440.01141: Set connection var ansible_pipelining to False 8454 1726882440.01144: Set connection var ansible_module_compression to ZIP_DEFLATED 8454 1726882440.01182: variable 'ansible_shell_executable' from source: unknown 8454 1726882440.01185: variable 'ansible_connection' from source: unknown 8454 1726882440.01189: variable 'ansible_module_compression' from source: unknown 8454 1726882440.01191: variable 'ansible_shell_type' from source: unknown 8454 1726882440.01193: variable 'ansible_shell_executable' from source: unknown 8454 1726882440.01195: variable 'ansible_host' from source: host vars for 'managed_node3' 8454 1726882440.01200: variable 'ansible_pipelining' from source: unknown 8454 1726882440.01204: variable 'ansible_timeout' from source: unknown 8454 1726882440.01207: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8454 1726882440.01494: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 8454 1726882440.01499: variable 'omit' from source: magic vars 8454 1726882440.01501: starting attempt loop 8454 1726882440.01504: running the handler 8454 1726882440.01506: _low_level_execute_command(): starting 8454 1726882440.01515: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 8454 1726882440.02247: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8454 1726882440.02296: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 8454 1726882440.02315: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8454 1726882440.02458: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8454 1726882440.04287: stdout chunk (state=3): >>>/root <<< 8454 1726882440.04437: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8454 1726882440.04453: stderr chunk (state=3): >>><<< 8454 1726882440.04457: stdout chunk (state=3): >>><<< 8454 1726882440.04479: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 8454 1726882440.04491: _low_level_execute_command(): starting 8454 1726882440.04497: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882440.0447788-9882-143728829222538 `" && echo ansible-tmp-1726882440.0447788-9882-143728829222538="` echo /root/.ansible/tmp/ansible-tmp-1726882440.0447788-9882-143728829222538 `" ) && sleep 0' 8454 1726882440.04920: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 8454 1726882440.04958: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 8454 1726882440.04962: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8454 1726882440.04965: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 8454 1726882440.04974: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8454 1726882440.05022: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 8454 1726882440.05026: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 8454 1726882440.05151: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8454 1726882440.07257: stdout chunk (state=3): >>>ansible-tmp-1726882440.0447788-9882-143728829222538=/root/.ansible/tmp/ansible-tmp-1726882440.0447788-9882-143728829222538 <<< 8454 1726882440.07400: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8454 1726882440.07443: stderr chunk (state=3): >>><<< 8454 1726882440.07458: stdout chunk (state=3): >>><<< 8454 1726882440.07501: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882440.0447788-9882-143728829222538=/root/.ansible/tmp/ansible-tmp-1726882440.0447788-9882-143728829222538 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 8454 1726882440.07559: variable 'ansible_module_compression' from source: unknown 8454 1726882440.07627: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-8454ynh6eu7y/ansiballz_cache/ansible.modules.ping-ZIP_DEFLATED 8454 1726882440.07655: variable 'ansible_facts' from source: unknown 8454 1726882440.07710: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882440.0447788-9882-143728829222538/AnsiballZ_ping.py 8454 1726882440.07845: Sending initial data 8454 1726882440.07853: Sent initial data (151 bytes) 8454 1726882440.08499: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 8454 1726882440.08503: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found <<< 8454 1726882440.08505: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8454 1726882440.08507: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 8454 1726882440.08510: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 8454 1726882440.08570: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 8454 1726882440.08574: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 8454 1726882440.08701: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8454 1726882440.10415: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 8454 1726882440.10529: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 8454 1726882440.10675: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-8454ynh6eu7y/tmpf7f6iebf /root/.ansible/tmp/ansible-tmp-1726882440.0447788-9882-143728829222538/AnsiballZ_ping.py <<< 8454 1726882440.10679: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882440.0447788-9882-143728829222538/AnsiballZ_ping.py" <<< 8454 1726882440.10770: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-8454ynh6eu7y/tmpf7f6iebf" to remote "/root/.ansible/tmp/ansible-tmp-1726882440.0447788-9882-143728829222538/AnsiballZ_ping.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882440.0447788-9882-143728829222538/AnsiballZ_ping.py" <<< 8454 1726882440.12756: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8454 1726882440.12813: stderr chunk (state=3): >>><<< 8454 1726882440.12827: stdout chunk (state=3): >>><<< 8454 1726882440.12866: done transferring module to remote 8454 1726882440.12888: _low_level_execute_command(): starting 8454 1726882440.12953: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882440.0447788-9882-143728829222538/ /root/.ansible/tmp/ansible-tmp-1726882440.0447788-9882-143728829222538/AnsiballZ_ping.py && sleep 0' 8454 1726882440.13869: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 8454 1726882440.13885: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 8454 1726882440.13950: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match 
pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8454 1726882440.14030: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 8454 1726882440.14046: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 8454 1726882440.14144: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8454 1726882440.16174: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8454 1726882440.16184: stderr chunk (state=3): >>><<< 8454 1726882440.16187: stdout chunk (state=3): >>><<< 8454 1726882440.16281: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 8454 1726882440.16284: _low_level_execute_command(): starting 8454 1726882440.16287: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882440.0447788-9882-143728829222538/AnsiballZ_ping.py && sleep 0' 8454 1726882440.16789: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 8454 1726882440.16794: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 8454 1726882440.16796: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8454 1726882440.16866: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 8454 1726882440.16987: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8454 1726882440.34328: stdout chunk (state=3): >>> {"ping": "pong", "invocation": {"module_args": 
{"data": "pong"}}} <<< 8454 1726882440.36073: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.41.238 closed. <<< 8454 1726882440.36077: stdout chunk (state=3): >>><<< 8454 1726882440.36079: stderr chunk (state=3): >>><<< 8454 1726882440.36082: _low_level_execute_command() done: rc=0, stdout= {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.41.238 closed. 
8454 1726882440.36084: done with _execute_module (ping, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ping', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882440.0447788-9882-143728829222538/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 8454 1726882440.36088: _low_level_execute_command(): starting 8454 1726882440.36090: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882440.0447788-9882-143728829222538/ > /dev/null 2>&1 && sleep 0' 8454 1726882440.36947: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 8454 1726882440.36981: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 8454 1726882440.36995: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 8454 1726882440.37049: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 
debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8454 1726882440.37123: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 8454 1726882440.37148: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 8454 1726882440.37162: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8454 1726882440.37314: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8454 1726882440.39406: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8454 1726882440.39434: stdout chunk (state=3): >>><<< 8454 1726882440.39454: stderr chunk (state=3): >>><<< 8454 1726882440.39485: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 8454 1726882440.39501: handler run complete 8454 1726882440.39527: 
attempt loop complete, returning result 8454 1726882440.39542: _execute() done 8454 1726882440.39550: dumping result to json 8454 1726882440.39687: done dumping result, returning 8454 1726882440.39752: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Re-test connectivity [0affe814-3a2d-f59f-16b9-000000000091] 8454 1726882440.39764: sending task result for task 0affe814-3a2d-f59f-16b9-000000000091 8454 1726882440.40008: done sending task result for task 0affe814-3a2d-f59f-16b9-000000000091 8454 1726882440.40011: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "ping": "pong" } 8454 1726882440.40082: no more pending results, returning what we have 8454 1726882440.40086: results queue empty 8454 1726882440.40087: checking for any_errors_fatal 8454 1726882440.40094: done checking for any_errors_fatal 8454 1726882440.40095: checking for max_fail_percentage 8454 1726882440.40097: done checking for max_fail_percentage 8454 1726882440.40098: checking to see if all hosts have failed and the running result is not ok 8454 1726882440.40099: done checking to see if all hosts have failed 8454 1726882440.40100: getting the remaining hosts for this loop 8454 1726882440.40102: done getting the remaining hosts for this loop 8454 1726882440.40106: getting the next task for host managed_node3 8454 1726882440.40116: done getting next task for host managed_node3 8454 1726882440.40119: ^ task is: TASK: meta (role_complete) 8454 1726882440.40140: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 8454 1726882440.40153: getting variables 8454 1726882440.40155: in VariableManager get_vars() 8454 1726882440.40197: Calling all_inventory to load vars for managed_node3 8454 1726882440.40200: Calling groups_inventory to load vars for managed_node3 8454 1726882440.40203: Calling all_plugins_inventory to load vars for managed_node3 8454 1726882440.40213: Calling all_plugins_play to load vars for managed_node3 8454 1726882440.40216: Calling groups_plugins_inventory to load vars for managed_node3 8454 1726882440.40219: Calling groups_plugins_play to load vars for managed_node3 8454 1726882440.42584: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8454 1726882440.45381: done with get_vars() 8454 1726882440.45404: done getting variables 8454 1726882440.45477: done queuing things up, now waiting for results queue to drain 8454 1726882440.45479: results queue empty 8454 1726882440.45480: checking for any_errors_fatal 8454 1726882440.45483: done checking for any_errors_fatal 8454 1726882440.45484: checking for max_fail_percentage 8454 1726882440.45485: done checking for max_fail_percentage 8454 1726882440.45485: checking to see if all hosts have failed and the running result is not ok 8454 1726882440.45486: done checking to see if all hosts have failed 8454 1726882440.45486: getting the remaining hosts for this loop 8454 1726882440.45487: done getting the remaining hosts for this loop 8454 1726882440.45489: getting the next task for host managed_node3 8454 1726882440.45493: done getting 
next task for host managed_node3 8454 1726882440.45495: ^ task is: TASK: Delete the device '{{ controller_device }}' 8454 1726882440.45497: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 8454 1726882440.45499: getting variables 8454 1726882440.45500: in VariableManager get_vars() 8454 1726882440.45512: Calling all_inventory to load vars for managed_node3 8454 1726882440.45514: Calling groups_inventory to load vars for managed_node3 8454 1726882440.45515: Calling all_plugins_inventory to load vars for managed_node3 8454 1726882440.45520: Calling all_plugins_play to load vars for managed_node3 8454 1726882440.45521: Calling groups_plugins_inventory to load vars for managed_node3 8454 1726882440.45524: Calling groups_plugins_play to load vars for managed_node3 8454 1726882440.46580: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8454 1726882440.48664: done with get_vars() 8454 1726882440.48686: done getting variables 8454 1726882440.48722: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 8454 1726882440.48822: variable 'controller_device' from source: play vars TASK [Delete the device 'nm-bond'] 
********************************************* task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond.yml:114 Friday 20 September 2024 21:34:00 -0400 (0:00:00.496) 0:00:38.506 ****** 8454 1726882440.48850: entering _queue_task() for managed_node3/command 8454 1726882440.49116: worker is 1 (out of 1 available) 8454 1726882440.49132: exiting _queue_task() for managed_node3/command 8454 1726882440.49148: done queuing things up, now waiting for results queue to drain 8454 1726882440.49150: waiting for pending results... 8454 1726882440.49343: running TaskExecutor() for managed_node3/TASK: Delete the device 'nm-bond' 8454 1726882440.49435: in run() - task 0affe814-3a2d-f59f-16b9-0000000000c1 8454 1726882440.49451: variable 'ansible_search_path' from source: unknown 8454 1726882440.49491: calling self._execute() 8454 1726882440.49565: variable 'ansible_host' from source: host vars for 'managed_node3' 8454 1726882440.49572: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8454 1726882440.49585: variable 'omit' from source: magic vars 8454 1726882440.49905: variable 'ansible_distribution_major_version' from source: facts 8454 1726882440.49917: Evaluated conditional (ansible_distribution_major_version != '6'): True 8454 1726882440.49921: variable 'omit' from source: magic vars 8454 1726882440.49944: variable 'omit' from source: magic vars 8454 1726882440.50066: variable 'controller_device' from source: play vars 8454 1726882440.50082: variable 'omit' from source: magic vars 8454 1726882440.50141: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 8454 1726882440.50176: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 8454 1726882440.50195: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 8454 1726882440.50340: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8454 1726882440.50344: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8454 1726882440.50346: variable 'inventory_hostname' from source: host vars for 'managed_node3' 8454 1726882440.50349: variable 'ansible_host' from source: host vars for 'managed_node3' 8454 1726882440.50352: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8454 1726882440.50404: Set connection var ansible_connection to ssh 8454 1726882440.50415: Set connection var ansible_shell_executable to /bin/sh 8454 1726882440.50423: Set connection var ansible_timeout to 10 8454 1726882440.50427: Set connection var ansible_shell_type to sh 8454 1726882440.50439: Set connection var ansible_pipelining to False 8454 1726882440.50447: Set connection var ansible_module_compression to ZIP_DEFLATED 8454 1726882440.50474: variable 'ansible_shell_executable' from source: unknown 8454 1726882440.50592: variable 'ansible_connection' from source: unknown 8454 1726882440.50595: variable 'ansible_module_compression' from source: unknown 8454 1726882440.50598: variable 'ansible_shell_type' from source: unknown 8454 1726882440.50600: variable 'ansible_shell_executable' from source: unknown 8454 1726882440.50603: variable 'ansible_host' from source: host vars for 'managed_node3' 8454 1726882440.50605: variable 'ansible_pipelining' from source: unknown 8454 1726882440.50607: variable 'ansible_timeout' from source: unknown 8454 1726882440.50610: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8454 1726882440.50840: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 8454 1726882440.50844: variable 'omit' from source: magic vars 8454 1726882440.50847: starting attempt loop 8454 1726882440.50849: running the handler 8454 1726882440.50851: _low_level_execute_command(): starting 8454 1726882440.50853: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 8454 1726882440.51480: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 8454 1726882440.51552: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8454 1726882440.51613: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 8454 1726882440.51631: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 8454 1726882440.51660: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8454 1726882440.51817: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8454 1726882440.53657: stdout chunk (state=3): >>>/root <<< 8454 1726882440.53830: stderr 
chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8454 1726882440.53844: stdout chunk (state=3): >>><<< 8454 1726882440.53864: stderr chunk (state=3): >>><<< 8454 1726882440.53893: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 8454 1726882440.53916: _low_level_execute_command(): starting 8454 1726882440.53929: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882440.5390227-9901-154982709404573 `" && echo ansible-tmp-1726882440.5390227-9901-154982709404573="` echo /root/.ansible/tmp/ansible-tmp-1726882440.5390227-9901-154982709404573 `" ) && sleep 0' 8454 1726882440.54603: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 8454 1726882440.54617: stderr chunk (state=3): >>>debug1: Reading 
configuration data /root/.ssh/config <<< 8454 1726882440.54636: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 8454 1726882440.54659: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 8454 1726882440.54680: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 <<< 8454 1726882440.54816: stderr chunk (state=3): >>>debug2: match not found <<< 8454 1726882440.54829: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 8454 1726882440.54872: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8454 1726882440.54985: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8454 1726882440.57095: stdout chunk (state=3): >>>ansible-tmp-1726882440.5390227-9901-154982709404573=/root/.ansible/tmp/ansible-tmp-1726882440.5390227-9901-154982709404573 <<< 8454 1726882440.57306: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8454 1726882440.57309: stdout chunk (state=3): >>><<< 8454 1726882440.57312: stderr chunk (state=3): >>><<< 8454 1726882440.57329: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882440.5390227-9901-154982709404573=/root/.ansible/tmp/ansible-tmp-1726882440.5390227-9901-154982709404573 , 
stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 8454 1726882440.57371: variable 'ansible_module_compression' from source: unknown 8454 1726882440.57518: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-8454ynh6eu7y/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 8454 1726882440.57522: variable 'ansible_facts' from source: unknown 8454 1726882440.57593: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882440.5390227-9901-154982709404573/AnsiballZ_command.py 8454 1726882440.57775: Sending initial data 8454 1726882440.57789: Sent initial data (154 bytes) 8454 1726882440.58586: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final 
all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 8454 1726882440.58613: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8454 1726882440.58764: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8454 1726882440.60501: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 8454 1726882440.60616: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 8454 1726882440.60739: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-8454ynh6eu7y/tmplkhdxr95 /root/.ansible/tmp/ansible-tmp-1726882440.5390227-9901-154982709404573/AnsiballZ_command.py <<< 8454 1726882440.60742: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882440.5390227-9901-154982709404573/AnsiballZ_command.py" <<< 8454 1726882440.60867: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-8454ynh6eu7y/tmplkhdxr95" to remote "/root/.ansible/tmp/ansible-tmp-1726882440.5390227-9901-154982709404573/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882440.5390227-9901-154982709404573/AnsiballZ_command.py" <<< 8454 1726882440.62344: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8454 1726882440.62364: stderr chunk (state=3): >>><<< 8454 1726882440.62500: stdout chunk (state=3): >>><<< 8454 1726882440.62503: done transferring module to remote 8454 1726882440.62505: _low_level_execute_command(): starting 8454 1726882440.62508: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882440.5390227-9901-154982709404573/ /root/.ansible/tmp/ansible-tmp-1726882440.5390227-9901-154982709404573/AnsiballZ_command.py && sleep 0' 8454 1726882440.63103: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 8454 1726882440.63205: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: 
hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8454 1726882440.63284: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 8454 1726882440.63327: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8454 1726882440.63465: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8454 1726882440.65471: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8454 1726882440.65527: stderr chunk (state=3): >>><<< 8454 1726882440.65531: stdout chunk (state=3): >>><<< 8454 1726882440.65535: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying 
existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 8454 1726882440.65539: _low_level_execute_command(): starting 8454 1726882440.65542: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882440.5390227-9901-154982709404573/AnsiballZ_command.py && sleep 0' 8454 1726882440.66245: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 8454 1726882440.66250: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 8454 1726882440.66253: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 8454 1726882440.66256: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8454 1726882440.66308: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 8454 1726882440.66320: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 8454 1726882440.66339: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8454 1726882440.66494: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8454 
1726882440.85265: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "Cannot find device \"nm-bond\"", "rc": 1, "cmd": ["ip", "link", "del", "nm-bond"], "start": "2024-09-20 21:34:00.842718", "end": "2024-09-20 21:34:00.850482", "delta": "0:00:00.007764", "failed": true, "msg": "non-zero return code", "invocation": {"module_args": {"_raw_params": "ip link del nm-bond", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 8454 1726882440.86898: stderr chunk (state=3): >>>debug2: Received exit status from master 1 Shared connection to 10.31.41.238 closed. <<< 8454 1726882440.86981: stderr chunk (state=3): >>><<< 8454 1726882440.86994: stdout chunk (state=3): >>><<< 8454 1726882440.87024: _low_level_execute_command() done: rc=1, stdout= {"changed": true, "stdout": "", "stderr": "Cannot find device \"nm-bond\"", "rc": 1, "cmd": ["ip", "link", "del", "nm-bond"], "start": "2024-09-20 21:34:00.842718", "end": "2024-09-20 21:34:00.850482", "delta": "0:00:00.007764", "failed": true, "msg": "non-zero return code", "invocation": {"module_args": {"_raw_params": "ip link del nm-bond", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address 
debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 1 Shared connection to 10.31.41.238 closed. 8454 1726882440.87064: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip link del nm-bond', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882440.5390227-9901-154982709404573/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 8454 1726882440.87074: _low_level_execute_command(): starting 8454 1726882440.87085: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882440.5390227-9901-154982709404573/ > /dev/null 2>&1 && sleep 0' 8454 1726882440.87663: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 8454 1726882440.87690: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8454 1726882440.87693: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8454 1726882440.87766: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 8454 1726882440.87772: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 8454 1726882440.87884: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8454 1726882440.89913: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8454 1726882440.89967: stderr chunk (state=3): >>><<< 8454 1726882440.89971: stdout chunk (state=3): >>><<< 8454 1726882440.89991: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration 
data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 8454 1726882440.89998: handler run complete 8454 1726882440.90040: Evaluated conditional (False): False 8454 1726882440.90044: Evaluated conditional (False): False 8454 1726882440.90058: attempt loop complete, returning result 8454 1726882440.90061: _execute() done 8454 1726882440.90066: dumping result to json 8454 1726882440.90073: done dumping result, returning 8454 1726882440.90084: done running TaskExecutor() for managed_node3/TASK: Delete the device 'nm-bond' [0affe814-3a2d-f59f-16b9-0000000000c1] 8454 1726882440.90091: sending task result for task 0affe814-3a2d-f59f-16b9-0000000000c1 8454 1726882440.90198: done sending task result for task 0affe814-3a2d-f59f-16b9-0000000000c1 8454 1726882440.90201: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "cmd": [ "ip", "link", "del", "nm-bond" ], "delta": "0:00:00.007764", "end": "2024-09-20 21:34:00.850482", "failed_when_result": false, "rc": 1, "start": "2024-09-20 21:34:00.842718" } STDERR: Cannot find device "nm-bond" MSG: non-zero return code 8454 1726882440.90284: no more pending results, returning what we have 8454 1726882440.90288: results queue empty 8454 1726882440.90289: checking for any_errors_fatal 8454 1726882440.90292: done checking for any_errors_fatal 8454 1726882440.90293: checking for max_fail_percentage 8454 1726882440.90296: done checking for max_fail_percentage 8454 1726882440.90297: checking to see if all hosts have failed and the 
running result is not ok 8454 1726882440.90298: done checking to see if all hosts have failed 8454 1726882440.90299: getting the remaining hosts for this loop 8454 1726882440.90301: done getting the remaining hosts for this loop 8454 1726882440.90305: getting the next task for host managed_node3 8454 1726882440.90314: done getting next task for host managed_node3 8454 1726882440.90318: ^ task is: TASK: Remove test interfaces 8454 1726882440.90322: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 8454 1726882440.90328: getting variables 8454 1726882440.90330: in VariableManager get_vars() 8454 1726882440.90418: Calling all_inventory to load vars for managed_node3 8454 1726882440.90422: Calling groups_inventory to load vars for managed_node3 8454 1726882440.90425: Calling all_plugins_inventory to load vars for managed_node3 8454 1726882440.90440: Calling all_plugins_play to load vars for managed_node3 8454 1726882440.90444: Calling groups_plugins_inventory to load vars for managed_node3 8454 1726882440.90455: Calling groups_plugins_play to load vars for managed_node3 8454 1726882440.96770: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8454 1726882441.00307: done with get_vars() 8454 1726882441.00360: done getting variables 8454 1726882441.00432: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Remove test interfaces] ************************************************** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/remove_test_interfaces_with_dhcp.yml:3 Friday 20 September 2024 21:34:01 -0400 (0:00:00.516) 0:00:39.022 ****** 8454 1726882441.00472: entering _queue_task() for managed_node3/shell 8454 1726882441.01172: worker is 1 (out of 1 available) 8454 1726882441.01187: exiting _queue_task() for managed_node3/shell 8454 1726882441.01200: done queuing things up, now waiting for results queue to drain 8454 1726882441.01202: waiting for pending results... 
8454 1726882441.02038: running TaskExecutor() for managed_node3/TASK: Remove test interfaces 8454 1726882441.02198: in run() - task 0affe814-3a2d-f59f-16b9-0000000000c5 8454 1726882441.02302: variable 'ansible_search_path' from source: unknown 8454 1726882441.02342: variable 'ansible_search_path' from source: unknown 8454 1726882441.02464: calling self._execute() 8454 1726882441.03053: variable 'ansible_host' from source: host vars for 'managed_node3' 8454 1726882441.03058: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8454 1726882441.03061: variable 'omit' from source: magic vars 8454 1726882441.03840: variable 'ansible_distribution_major_version' from source: facts 8454 1726882441.03856: Evaluated conditional (ansible_distribution_major_version != '6'): True 8454 1726882441.03862: variable 'omit' from source: magic vars 8454 1726882441.03943: variable 'omit' from source: magic vars 8454 1726882441.04161: variable 'dhcp_interface1' from source: play vars 8454 1726882441.04167: variable 'dhcp_interface2' from source: play vars 8454 1726882441.04195: variable 'omit' from source: magic vars 8454 1726882441.04249: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 8454 1726882441.04293: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 8454 1726882441.04322: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 8454 1726882441.04347: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8454 1726882441.04360: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8454 1726882441.04400: variable 'inventory_hostname' from source: host vars for 'managed_node3' 8454 1726882441.04404: variable 'ansible_host' from source: host vars for 
'managed_node3' 8454 1726882441.04406: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8454 1726882441.04539: Set connection var ansible_connection to ssh 8454 1726882441.04551: Set connection var ansible_shell_executable to /bin/sh 8454 1726882441.04559: Set connection var ansible_timeout to 10 8454 1726882441.04562: Set connection var ansible_shell_type to sh 8454 1726882441.04572: Set connection var ansible_pipelining to False 8454 1726882441.04583: Set connection var ansible_module_compression to ZIP_DEFLATED 8454 1726882441.04607: variable 'ansible_shell_executable' from source: unknown 8454 1726882441.04610: variable 'ansible_connection' from source: unknown 8454 1726882441.04615: variable 'ansible_module_compression' from source: unknown 8454 1726882441.04618: variable 'ansible_shell_type' from source: unknown 8454 1726882441.04623: variable 'ansible_shell_executable' from source: unknown 8454 1726882441.04625: variable 'ansible_host' from source: host vars for 'managed_node3' 8454 1726882441.04640: variable 'ansible_pipelining' from source: unknown 8454 1726882441.04644: variable 'ansible_timeout' from source: unknown 8454 1726882441.04648: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8454 1726882441.04995: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 8454 1726882441.04999: variable 'omit' from source: magic vars 8454 1726882441.05002: starting attempt loop 8454 1726882441.05004: running the handler 8454 1726882441.05007: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 8454 1726882441.05010: _low_level_execute_command(): starting 8454 1726882441.05012: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 8454 1726882441.06108: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 8454 1726882441.06151: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8454 1726882441.06272: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8454 1726882441.08145: stdout chunk (state=3): >>>/root <<< 8454 1726882441.08448: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8454 1726882441.08452: stdout chunk (state=3): >>><<< 8454 1726882441.08454: stderr chunk (state=3): >>><<< 8454 1726882441.08457: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: 
Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 8454 1726882441.08460: _low_level_execute_command(): starting 8454 1726882441.08463: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882441.0838556-9916-256484604712162 `" && echo ansible-tmp-1726882441.0838556-9916-256484604712162="` echo /root/.ansible/tmp/ansible-tmp-1726882441.0838556-9916-256484604712162 `" ) && sleep 0' 8454 1726882441.09340: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 8454 1726882441.09344: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 8454 1726882441.09348: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 8454 1726882441.09350: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 8454 1726882441.09359: stderr chunk 
(state=3): >>>debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 <<< 8454 1726882441.09361: stderr chunk (state=3): >>>debug2: match not found <<< 8454 1726882441.09363: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8454 1726882441.09365: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 8454 1726882441.09367: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.41.238 is address <<< 8454 1726882441.09369: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 8454 1726882441.09371: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 8454 1726882441.09374: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 8454 1726882441.09376: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 8454 1726882441.09378: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 <<< 8454 1726882441.09380: stderr chunk (state=3): >>>debug2: match found <<< 8454 1726882441.09383: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8454 1726882441.09440: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 8454 1726882441.09443: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 8454 1726882441.09542: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8454 1726882441.09844: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8454 1726882441.11967: stdout chunk (state=3): >>>ansible-tmp-1726882441.0838556-9916-256484604712162=/root/.ansible/tmp/ansible-tmp-1726882441.0838556-9916-256484604712162 <<< 8454 1726882441.12163: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8454 1726882441.12167: 
stdout chunk (state=3): >>><<< 8454 1726882441.12176: stderr chunk (state=3): >>><<< 8454 1726882441.12343: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882441.0838556-9916-256484604712162=/root/.ansible/tmp/ansible-tmp-1726882441.0838556-9916-256484604712162 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 8454 1726882441.12393: variable 'ansible_module_compression' from source: unknown 8454 1726882441.12459: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-8454ynh6eu7y/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 8454 1726882441.12505: variable 'ansible_facts' from source: unknown 8454 1726882441.12845: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882441.0838556-9916-256484604712162/AnsiballZ_command.py 8454 1726882441.12848: Sending initial data 8454 1726882441.12850: Sent initial data (154 bytes) 8454 1726882441.13359: 
stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 8454 1726882441.13369: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 8454 1726882441.13384: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 8454 1726882441.13457: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8454 1726882441.13509: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 8454 1726882441.13628: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 8454 1726882441.13632: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8454 1726882441.13763: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8454 1726882441.15491: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: 
Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 8454 1726882441.15589: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 8454 1726882441.15965: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-8454ynh6eu7y/tmpcy3zpsur /root/.ansible/tmp/ansible-tmp-1726882441.0838556-9916-256484604712162/AnsiballZ_command.py <<< 8454 1726882441.15968: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882441.0838556-9916-256484604712162/AnsiballZ_command.py" <<< 8454 1726882441.16077: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-8454ynh6eu7y/tmpcy3zpsur" to remote "/root/.ansible/tmp/ansible-tmp-1726882441.0838556-9916-256484604712162/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882441.0838556-9916-256484604712162/AnsiballZ_command.py" <<< 8454 1726882441.18632: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8454 1726882441.18639: stdout chunk (state=3): >>><<< 8454 1726882441.18650: stderr chunk (state=3): >>><<< 8454 1726882441.18682: done transferring module to remote 8454 1726882441.18694: _low_level_execute_command(): starting 8454 1726882441.18701: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882441.0838556-9916-256484604712162/ /root/.ansible/tmp/ansible-tmp-1726882441.0838556-9916-256484604712162/AnsiballZ_command.py && sleep 0' 8454 1726882441.20098: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 8454 1726882441.20103: stderr chunk (state=3): >>>debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master <<< 8454 1726882441.20145: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 8454 1726882441.20163: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8454 1726882441.20321: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8454 1726882441.22364: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8454 1726882441.22379: stdout chunk (state=3): >>><<< 8454 1726882441.22393: stderr chunk (state=3): >>><<< 8454 1726882441.22461: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: 
re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 8454 1726882441.22471: _low_level_execute_command(): starting 8454 1726882441.22489: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882441.0838556-9916-256484604712162/AnsiballZ_command.py && sleep 0' 8454 1726882441.23197: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 8454 1726882441.23215: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 8454 1726882441.23232: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 8454 1726882441.23254: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 8454 1726882441.23271: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 <<< 8454 1726882441.23381: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 
10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 8454 1726882441.23452: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8454 1726882441.23590: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8454 1726882441.44784: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "+ exec\n+ rc=0\n+ ip link delete test1\n+ '[' 0 '!=' 0 ']'\n+ ip link delete test2\n+ '[' 0 '!=' 0 ']'\n+ ip link delete testbr\n+ '[' 0 '!=' 0 ']'", "rc": 0, "cmd": "set -euxo pipefail\nexec 1>&2\nrc=0\nip link delete test1 || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link test1 - error \"$rc\"\nfi\nip link delete test2 || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link test2 - error \"$rc\"\nfi\nip link delete testbr || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link testbr - error \"$rc\"\nfi\n", "start": "2024-09-20 21:34:01.409566", "end": "2024-09-20 21:34:01.443889", "delta": "0:00:00.034323", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euxo pipefail\nexec 1>&2\nrc=0\nip link delete test1 || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link test1 - error \"$rc\"\nfi\nip link delete test2 || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link test2 - error \"$rc\"\nfi\nip link delete testbr || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link testbr - error \"$rc\"\nfi\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 8454 1726882441.46519: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.41.238 
closed. <<< 8454 1726882441.46598: stderr chunk (state=3): >>><<< 8454 1726882441.46607: stdout chunk (state=3): >>><<< 8454 1726882441.46912: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "+ exec\n+ rc=0\n+ ip link delete test1\n+ '[' 0 '!=' 0 ']'\n+ ip link delete test2\n+ '[' 0 '!=' 0 ']'\n+ ip link delete testbr\n+ '[' 0 '!=' 0 ']'", "rc": 0, "cmd": "set -euxo pipefail\nexec 1>&2\nrc=0\nip link delete test1 || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link test1 - error \"$rc\"\nfi\nip link delete test2 || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link test2 - error \"$rc\"\nfi\nip link delete testbr || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link testbr - error \"$rc\"\nfi\n", "start": "2024-09-20 21:34:01.409566", "end": "2024-09-20 21:34:01.443889", "delta": "0:00:00.034323", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euxo pipefail\nexec 1>&2\nrc=0\nip link delete test1 || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link test1 - error \"$rc\"\nfi\nip link delete test2 || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link test2 - error \"$rc\"\nfi\nip link delete testbr || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link testbr - error \"$rc\"\nfi\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.41.238 closed. 8454 1726882441.46917: done with _execute_module (ansible.legacy.command, {'_raw_params': 'set -euxo pipefail\nexec 1>&2\nrc=0\nip link delete test1 || rc="$?"\nif [ "$rc" != 0 ]; then\n echo ERROR - could not delete link test1 - error "$rc"\nfi\nip link delete test2 || rc="$?"\nif [ "$rc" != 0 ]; then\n echo ERROR - could not delete link test2 - error "$rc"\nfi\nip link delete testbr || rc="$?"\nif [ "$rc" != 0 ]; then\n echo ERROR - could not delete link testbr - error "$rc"\nfi\n', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882441.0838556-9916-256484604712162/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, 
'_ansible_target_log_info': None}) 8454 1726882441.46924: _low_level_execute_command(): starting 8454 1726882441.46927: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882441.0838556-9916-256484604712162/ > /dev/null 2>&1 && sleep 0' 8454 1726882441.47897: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 8454 1726882441.48254: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master <<< 8454 1726882441.48268: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 8454 1726882441.48289: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8454 1726882441.48436: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8454 1726882441.50562: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8454 1726882441.50648: stdout chunk (state=3): >>><<< 8454 1726882441.50662: stderr chunk (state=3): >>><<< 8454 1726882441.50687: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 8454 1726882441.50702: handler run complete 8454 1726882441.50741: Evaluated conditional (False): False 8454 1726882441.51039: attempt loop complete, returning result 8454 1726882441.51042: _execute() done 8454 1726882441.51044: dumping result to json 8454 1726882441.51047: done dumping result, returning 8454 1726882441.51049: done running TaskExecutor() for managed_node3/TASK: Remove test interfaces [0affe814-3a2d-f59f-16b9-0000000000c5] 8454 1726882441.51051: sending task result for task 0affe814-3a2d-f59f-16b9-0000000000c5 ok: [managed_node3] => { "changed": false, "cmd": "set -euxo pipefail\nexec 1>&2\nrc=0\nip link delete test1 || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link test1 - error \"$rc\"\nfi\nip link delete test2 || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link test2 - error \"$rc\"\nfi\nip link delete testbr || rc=\"$?\"\nif [ \"$rc\" 
!= 0 ]; then\n echo ERROR - could not delete link testbr - error \"$rc\"\nfi\n", "delta": "0:00:00.034323", "end": "2024-09-20 21:34:01.443889", "rc": 0, "start": "2024-09-20 21:34:01.409566" } STDERR: + exec + rc=0 + ip link delete test1 + '[' 0 '!=' 0 ']' + ip link delete test2 + '[' 0 '!=' 0 ']' + ip link delete testbr + '[' 0 '!=' 0 ']' 8454 1726882441.51212: no more pending results, returning what we have 8454 1726882441.51216: results queue empty 8454 1726882441.51217: checking for any_errors_fatal 8454 1726882441.51230: done checking for any_errors_fatal 8454 1726882441.51231: checking for max_fail_percentage 8454 1726882441.51233: done checking for max_fail_percentage 8454 1726882441.51336: checking to see if all hosts have failed and the running result is not ok 8454 1726882441.51338: done checking to see if all hosts have failed 8454 1726882441.51339: getting the remaining hosts for this loop 8454 1726882441.51342: done getting the remaining hosts for this loop 8454 1726882441.51351: getting the next task for host managed_node3 8454 1726882441.51359: done getting next task for host managed_node3 8454 1726882441.51362: ^ task is: TASK: Stop dnsmasq/radvd services 8454 1726882441.51367: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False), did rescue? False, did start at task? False 8454 1726882441.51373: getting variables 8454 1726882441.51375: in VariableManager get_vars() 8454 1726882441.51424: Calling all_inventory to load vars for managed_node3 8454 1726882441.51427: Calling groups_inventory to load vars for managed_node3 8454 1726882441.51430: Calling all_plugins_inventory to load vars for managed_node3 8454 1726882441.51579: Calling all_plugins_play to load vars for managed_node3 8454 1726882441.51583: Calling groups_plugins_inventory to load vars for managed_node3 8454 1726882441.51588: Calling groups_plugins_play to load vars for managed_node3 8454 1726882441.52107: done sending task result for task 0affe814-3a2d-f59f-16b9-0000000000c5 8454 1726882441.52111: WORKER PROCESS EXITING 8454 1726882441.56559: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8454 1726882441.62230: done with get_vars() 8454 1726882441.62477: done getting variables 8454 1726882441.62554: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Stop dnsmasq/radvd services] ********************************************* task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/remove_test_interfaces_with_dhcp.yml:23 Friday 20 September 2024 21:34:01 -0400 (0:00:00.621) 0:00:39.643 ****** 8454 1726882441.62595: entering _queue_task() for managed_node3/shell 8454 1726882441.63363: worker is 1 (out of 1 available) 8454 1726882441.63380: exiting _queue_task() for managed_node3/shell 8454 1726882441.63395: done queuing things up, now waiting for results queue to drain 8454 1726882441.63397: waiting for pending results... 
8454 1726882441.63863: running TaskExecutor() for managed_node3/TASK: Stop dnsmasq/radvd services 8454 1726882441.64345: in run() - task 0affe814-3a2d-f59f-16b9-0000000000c6 8454 1726882441.64369: variable 'ansible_search_path' from source: unknown 8454 1726882441.64380: variable 'ansible_search_path' from source: unknown 8454 1726882441.64453: calling self._execute() 8454 1726882441.64633: variable 'ansible_host' from source: host vars for 'managed_node3' 8454 1726882441.64839: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8454 1726882441.64844: variable 'omit' from source: magic vars 8454 1726882441.65733: variable 'ansible_distribution_major_version' from source: facts 8454 1726882441.65822: Evaluated conditional (ansible_distribution_major_version != '6'): True 8454 1726882441.65837: variable 'omit' from source: magic vars 8454 1726882441.66171: variable 'omit' from source: magic vars 8454 1726882441.66178: variable 'omit' from source: magic vars 8454 1726882441.66181: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 8454 1726882441.66313: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 8454 1726882441.66418: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 8454 1726882441.66525: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8454 1726882441.66940: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8454 1726882441.66944: variable 'inventory_hostname' from source: host vars for 'managed_node3' 8454 1726882441.66947: variable 'ansible_host' from source: host vars for 'managed_node3' 8454 1726882441.66950: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8454 1726882441.67226: Set connection 
var ansible_connection to ssh 8454 1726882441.67431: Set connection var ansible_shell_executable to /bin/sh 8454 1726882441.67445: Set connection var ansible_timeout to 10 8454 1726882441.67453: Set connection var ansible_shell_type to sh 8454 1726882441.67468: Set connection var ansible_pipelining to False 8454 1726882441.67482: Set connection var ansible_module_compression to ZIP_DEFLATED 8454 1726882441.67512: variable 'ansible_shell_executable' from source: unknown 8454 1726882441.67590: variable 'ansible_connection' from source: unknown 8454 1726882441.67601: variable 'ansible_module_compression' from source: unknown 8454 1726882441.67612: variable 'ansible_shell_type' from source: unknown 8454 1726882441.67622: variable 'ansible_shell_executable' from source: unknown 8454 1726882441.67633: variable 'ansible_host' from source: host vars for 'managed_node3' 8454 1726882441.67648: variable 'ansible_pipelining' from source: unknown 8454 1726882441.67658: variable 'ansible_timeout' from source: unknown 8454 1726882441.67667: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8454 1726882441.68039: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 8454 1726882441.68059: variable 'omit' from source: magic vars 8454 1726882441.68070: starting attempt loop 8454 1726882441.68081: running the handler 8454 1726882441.68098: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 8454 1726882441.68413: _low_level_execute_command(): starting 8454 
1726882441.68417: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 8454 1726882441.69531: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 8454 1726882441.69554: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 8454 1726882441.69609: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8454 1726882441.69847: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 8454 1726882441.69924: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8454 1726882441.71796: stdout chunk (state=3): >>>/root <<< 8454 1726882441.71898: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8454 1726882441.72075: stderr chunk (state=3): >>><<< 8454 1726882441.72157: stdout chunk (state=3): >>><<< 8454 1726882441.72185: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 8454 1726882441.72208: _low_level_execute_command(): starting 8454 1726882441.72418: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882441.7219312-9954-143098312922297 `" && echo ansible-tmp-1726882441.7219312-9954-143098312922297="` echo /root/.ansible/tmp/ansible-tmp-1726882441.7219312-9954-143098312922297 `" ) && sleep 0' 8454 1726882441.73597: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 8454 1726882441.73614: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 8454 1726882441.73649: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8454 1726882441.73840: stderr chunk 
(state=3): >>>debug1: configuration requests final Match pass <<< 8454 1726882441.73899: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 8454 1726882441.74041: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8454 1726882441.74164: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8454 1726882441.76380: stdout chunk (state=3): >>>ansible-tmp-1726882441.7219312-9954-143098312922297=/root/.ansible/tmp/ansible-tmp-1726882441.7219312-9954-143098312922297 <<< 8454 1726882441.76940: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8454 1726882441.76944: stdout chunk (state=3): >>><<< 8454 1726882441.76947: stderr chunk (state=3): >>><<< 8454 1726882441.76950: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882441.7219312-9954-143098312922297=/root/.ansible/tmp/ansible-tmp-1726882441.7219312-9954-143098312922297 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing 
configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 8454 1726882441.76952: variable 'ansible_module_compression' from source: unknown 8454 1726882441.76955: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-8454ynh6eu7y/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 8454 1726882441.76958: variable 'ansible_facts' from source: unknown 8454 1726882441.77209: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882441.7219312-9954-143098312922297/AnsiballZ_command.py 8454 1726882441.77591: Sending initial data 8454 1726882441.77595: Sent initial data (154 bytes) 8454 1726882441.78597: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 8454 1726882441.78761: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking 
match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 8454 1726882441.78890: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8454 1726882441.80695: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 8454 1726882441.80705: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 <<< 8454 1726882441.80713: stderr chunk (state=3): >>>debug2: Server supports extension "statvfs@openssh.com" revision 2 <<< 8454 1726882441.80722: stderr chunk (state=3): >>>debug2: Server supports extension "fstatvfs@openssh.com" revision 2 <<< 8454 1726882441.80748: stderr chunk (state=3): >>>debug2: Server supports extension "hardlink@openssh.com" revision 1 <<< 8454 1726882441.80758: stderr chunk (state=3): >>>debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 8454 1726882441.80895: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 8454 1726882441.81022: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-8454ynh6eu7y/tmpqu_so6mt /root/.ansible/tmp/ansible-tmp-1726882441.7219312-9954-143098312922297/AnsiballZ_command.py <<< 8454 1726882441.81026: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882441.7219312-9954-143098312922297/AnsiballZ_command.py" <<< 8454 1726882441.81149: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-8454ynh6eu7y/tmpqu_so6mt" to remote "/root/.ansible/tmp/ansible-tmp-1726882441.7219312-9954-143098312922297/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882441.7219312-9954-143098312922297/AnsiballZ_command.py" <<< 8454 1726882441.83638: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8454 1726882441.83644: stderr chunk (state=3): >>><<< 8454 1726882441.83646: stdout chunk (state=3): >>><<< 8454 1726882441.83665: done transferring module to remote 8454 1726882441.83690: _low_level_execute_command(): starting 8454 1726882441.83695: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882441.7219312-9954-143098312922297/ /root/.ansible/tmp/ansible-tmp-1726882441.7219312-9954-143098312922297/AnsiballZ_command.py && sleep 0' 8454 1726882441.84317: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 8454 1726882441.84323: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 8454 1726882441.84343: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found <<< 8454 1726882441.84350: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address <<< 8454 1726882441.84357: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 8454 1726882441.84363: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 8454 1726882441.84370: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 8454 1726882441.84394: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found <<< 8454 1726882441.84462: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8454 1726882441.84509: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 8454 1726882441.84627: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8454 1726882441.86985: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8454 1726882441.86994: stdout chunk (state=3): >>><<< 8454 1726882441.86997: stderr chunk (state=3): >>><<< 8454 1726882441.87000: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 8454 1726882441.87003: _low_level_execute_command(): starting 8454 1726882441.87006: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882441.7219312-9954-143098312922297/AnsiballZ_command.py && sleep 0' 8454 1726882441.87663: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 8454 1726882441.87667: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found <<< 8454 1726882441.87670: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8454 1726882441.87753: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8454 1726882441.87771: stderr chunk (state=3): 
>>>debug1: auto-mux: Trying existing master <<< 8454 1726882441.87782: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 8454 1726882441.87801: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8454 1726882441.87976: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8454 1726882442.09238: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "+ exec\n+ pkill -F /run/dhcp_testbr.pid\n+ rm -rf /run/dhcp_testbr.pid\n+ rm -rf /run/dhcp_testbr.lease\n+ grep 'release 6' /etc/redhat-release\n+ systemctl is-active firewalld\ninactive", "rc": 0, "cmd": "set -uxo pipefail\nexec 1>&2\npkill -F /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.lease\nif grep 'release 6' /etc/redhat-release; then\n # Stop radvd server\n service radvd stop\n iptables -D INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\nfi\nif systemctl is-active firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if firewall-cmd --query-service=\"$service\"; then\n firewall-cmd --remove-service \"$service\"\n fi\n done\nfi\n", "start": "2024-09-20 21:34:02.060348", "end": "2024-09-20 21:34:02.089081", "delta": "0:00:00.028733", "msg": "", "invocation": {"module_args": {"_raw_params": "set -uxo pipefail\nexec 1>&2\npkill -F /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.lease\nif grep 'release 6' /etc/redhat-release; then\n # Stop radvd server\n service radvd stop\n iptables -D INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\nfi\nif systemctl is-active firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if firewall-cmd --query-service=\"$service\"; then\n firewall-cmd --remove-service \"$service\"\n fi\n done\nfi\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": 
null}}} <<< 8454 1726882442.11141: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.41.238 closed. <<< 8454 1726882442.11145: stdout chunk (state=3): >>><<< 8454 1726882442.11148: stderr chunk (state=3): >>><<< 8454 1726882442.11152: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "+ exec\n+ pkill -F /run/dhcp_testbr.pid\n+ rm -rf /run/dhcp_testbr.pid\n+ rm -rf /run/dhcp_testbr.lease\n+ grep 'release 6' /etc/redhat-release\n+ systemctl is-active firewalld\ninactive", "rc": 0, "cmd": "set -uxo pipefail\nexec 1>&2\npkill -F /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.lease\nif grep 'release 6' /etc/redhat-release; then\n # Stop radvd server\n service radvd stop\n iptables -D INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\nfi\nif systemctl is-active firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if firewall-cmd --query-service=\"$service\"; then\n firewall-cmd --remove-service \"$service\"\n fi\n done\nfi\n", "start": "2024-09-20 21:34:02.060348", "end": "2024-09-20 21:34:02.089081", "delta": "0:00:00.028733", "msg": "", "invocation": {"module_args": {"_raw_params": "set -uxo pipefail\nexec 1>&2\npkill -F /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.lease\nif grep 'release 6' /etc/redhat-release; then\n # Stop radvd server\n service radvd stop\n iptables -D INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\nfi\nif systemctl is-active firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if firewall-cmd --query-service=\"$service\"; then\n firewall-cmd --remove-service \"$service\"\n fi\n done\nfi\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.41.238 closed. 
8454 1726882442.11163: done with _execute_module (ansible.legacy.command, {'_raw_params': 'set -uxo pipefail\nexec 1>&2\npkill -F /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.lease\nif grep \'release 6\' /etc/redhat-release; then\n # Stop radvd server\n service radvd stop\n iptables -D INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\nfi\nif systemctl is-active firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if firewall-cmd --query-service="$service"; then\n firewall-cmd --remove-service "$service"\n fi\n done\nfi\n', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882441.7219312-9954-143098312922297/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 8454 1726882442.11165: _low_level_execute_command(): starting 8454 1726882442.11168: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882441.7219312-9954-143098312922297/ > /dev/null 2>&1 && sleep 0' 8454 1726882442.11857: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 8454 1726882442.11866: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 8454 1726882442.11877: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 8454 1726882442.11908: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 8454 1726882442.11954: 
stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 <<< 8454 1726882442.11974: stderr chunk (state=3): >>>debug2: match not found <<< 8454 1726882442.11987: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8454 1726882442.12096: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 8454 1726882442.12120: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8454 1726882442.12269: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8454 1726882442.14445: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8454 1726882442.14545: stderr chunk (state=3): >>><<< 8454 1726882442.14549: stdout chunk (state=3): >>><<< 8454 1726882442.14603: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing 
configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 8454 1726882442.14612: handler run complete 8454 1726882442.14729: Evaluated conditional (False): False 8454 1726882442.14753: attempt loop complete, returning result 8454 1726882442.14783: _execute() done 8454 1726882442.14792: dumping result to json 8454 1726882442.14814: done dumping result, returning 8454 1726882442.14830: done running TaskExecutor() for managed_node3/TASK: Stop dnsmasq/radvd services [0affe814-3a2d-f59f-16b9-0000000000c6] 8454 1726882442.14940: sending task result for task 0affe814-3a2d-f59f-16b9-0000000000c6 8454 1726882442.15025: done sending task result for task 0affe814-3a2d-f59f-16b9-0000000000c6 8454 1726882442.15030: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "cmd": "set -uxo pipefail\nexec 1>&2\npkill -F /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.lease\nif grep 'release 6' /etc/redhat-release; then\n # Stop radvd server\n service radvd stop\n iptables -D INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\nfi\nif systemctl is-active firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if firewall-cmd --query-service=\"$service\"; then\n firewall-cmd --remove-service \"$service\"\n fi\n done\nfi\n", "delta": "0:00:00.028733", "end": "2024-09-20 21:34:02.089081", "rc": 0, "start": "2024-09-20 21:34:02.060348" } STDERR: + exec + pkill -F /run/dhcp_testbr.pid + rm -rf 
/run/dhcp_testbr.pid + rm -rf /run/dhcp_testbr.lease + grep 'release 6' /etc/redhat-release + systemctl is-active firewalld inactive 8454 1726882442.15125: no more pending results, returning what we have 8454 1726882442.15129: results queue empty 8454 1726882442.15129: checking for any_errors_fatal 8454 1726882442.15199: done checking for any_errors_fatal 8454 1726882442.15201: checking for max_fail_percentage 8454 1726882442.15203: done checking for max_fail_percentage 8454 1726882442.15204: checking to see if all hosts have failed and the running result is not ok 8454 1726882442.15205: done checking to see if all hosts have failed 8454 1726882442.15206: getting the remaining hosts for this loop 8454 1726882442.15207: done getting the remaining hosts for this loop 8454 1726882442.15212: getting the next task for host managed_node3 8454 1726882442.15220: done getting next task for host managed_node3 8454 1726882442.15224: ^ task is: TASK: Restore the /etc/resolv.conf for initscript 8454 1726882442.15227: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 8454 1726882442.15231: getting variables 8454 1726882442.15233: in VariableManager get_vars() 8454 1726882442.15279: Calling all_inventory to load vars for managed_node3 8454 1726882442.15283: Calling groups_inventory to load vars for managed_node3 8454 1726882442.15286: Calling all_plugins_inventory to load vars for managed_node3 8454 1726882442.15362: Calling all_plugins_play to load vars for managed_node3 8454 1726882442.15371: Calling groups_plugins_inventory to load vars for managed_node3 8454 1726882442.15375: Calling groups_plugins_play to load vars for managed_node3 8454 1726882442.18629: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8454 1726882442.22593: done with get_vars() 8454 1726882442.22629: done getting variables 8454 1726882442.22702: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Restore the /etc/resolv.conf for initscript] ***************************** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond.yml:120 Friday 20 September 2024 21:34:02 -0400 (0:00:00.601) 0:00:40.245 ****** 8454 1726882442.22945: entering _queue_task() for managed_node3/command 8454 1726882442.23405: worker is 1 (out of 1 available) 8454 1726882442.23419: exiting _queue_task() for managed_node3/command 8454 1726882442.23433: done queuing things up, now waiting for results queue to drain 8454 1726882442.23552: waiting for pending results... 
8454 1726882442.24931: running TaskExecutor() for managed_node3/TASK: Restore the /etc/resolv.conf for initscript 8454 1726882442.25440: in run() - task 0affe814-3a2d-f59f-16b9-0000000000c7 8454 1726882442.25445: variable 'ansible_search_path' from source: unknown 8454 1726882442.25448: calling self._execute() 8454 1726882442.25540: variable 'ansible_host' from source: host vars for 'managed_node3' 8454 1726882442.25955: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8454 1726882442.25969: variable 'omit' from source: magic vars 8454 1726882442.27116: variable 'ansible_distribution_major_version' from source: facts 8454 1726882442.27120: Evaluated conditional (ansible_distribution_major_version != '6'): True 8454 1726882442.27244: variable 'network_provider' from source: set_fact 8454 1726882442.27248: Evaluated conditional (network_provider == "initscripts"): False 8454 1726882442.27255: when evaluation is False, skipping this task 8454 1726882442.27258: _execute() done 8454 1726882442.27261: dumping result to json 8454 1726882442.27265: done dumping result, returning 8454 1726882442.27274: done running TaskExecutor() for managed_node3/TASK: Restore the /etc/resolv.conf for initscript [0affe814-3a2d-f59f-16b9-0000000000c7] 8454 1726882442.27334: sending task result for task 0affe814-3a2d-f59f-16b9-0000000000c7 8454 1726882442.27419: done sending task result for task 0affe814-3a2d-f59f-16b9-0000000000c7 8454 1726882442.27422: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 8454 1726882442.27491: no more pending results, returning what we have 8454 1726882442.27496: results queue empty 8454 1726882442.27497: checking for any_errors_fatal 8454 1726882442.27513: done checking for any_errors_fatal 8454 1726882442.27514: checking for max_fail_percentage 8454 1726882442.27516: done checking for max_fail_percentage 
8454 1726882442.27518: checking to see if all hosts have failed and the running result is not ok 8454 1726882442.27519: done checking to see if all hosts have failed 8454 1726882442.27520: getting the remaining hosts for this loop 8454 1726882442.27522: done getting the remaining hosts for this loop 8454 1726882442.27527: getting the next task for host managed_node3 8454 1726882442.27537: done getting next task for host managed_node3 8454 1726882442.27541: ^ task is: TASK: Verify network state restored to default 8454 1726882442.27545: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 8454 1726882442.27551: getting variables 8454 1726882442.27552: in VariableManager get_vars() 8454 1726882442.27597: Calling all_inventory to load vars for managed_node3 8454 1726882442.27601: Calling groups_inventory to load vars for managed_node3 8454 1726882442.27603: Calling all_plugins_inventory to load vars for managed_node3 8454 1726882442.27618: Calling all_plugins_play to load vars for managed_node3 8454 1726882442.27622: Calling groups_plugins_inventory to load vars for managed_node3 8454 1726882442.27625: Calling groups_plugins_play to load vars for managed_node3 8454 1726882442.32101: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8454 1726882442.35098: done with get_vars() 8454 1726882442.35138: done getting variables TASK [Verify network state restored to default] ******************************** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond.yml:125 Friday 20 September 2024 21:34:02 -0400 (0:00:00.123) 0:00:40.370 ****** 8454 1726882442.35257: entering _queue_task() for managed_node3/include_tasks 8454 1726882442.35771: worker is 1 (out of 1 available) 8454 1726882442.35785: exiting _queue_task() for managed_node3/include_tasks 8454 1726882442.35798: done queuing things up, now waiting for results queue to drain 8454 1726882442.35799: waiting for pending results... 
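The include being queued here (tests_bond.yml:125) plausibly looks like the following. This is a sketch inferred from the task name and the included file the log goes on to process; the real playbook source may carry additional keywords:

```yaml
# Sketch, not the verbatim playbook source.
- name: Verify network state restored to default
  include_tasks: tasks/check_network_dns.yml
```

The `ansible_distribution_major_version != '6'` conditional evaluated just after this point appears on every task in this run, so it looks like a run-wide guard rather than something specific to this include.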
8454 1726882442.36152: running TaskExecutor() for managed_node3/TASK: Verify network state restored to default 8454 1726882442.36441: in run() - task 0affe814-3a2d-f59f-16b9-0000000000c8 8454 1726882442.36445: variable 'ansible_search_path' from source: unknown 8454 1726882442.36457: calling self._execute() 8454 1726882442.36780: variable 'ansible_host' from source: host vars for 'managed_node3' 8454 1726882442.36785: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8454 1726882442.36789: variable 'omit' from source: magic vars 8454 1726882442.37545: variable 'ansible_distribution_major_version' from source: facts 8454 1726882442.37655: Evaluated conditional (ansible_distribution_major_version != '6'): True 8454 1726882442.37671: _execute() done 8454 1726882442.37680: dumping result to json 8454 1726882442.37691: done dumping result, returning 8454 1726882442.37701: done running TaskExecutor() for managed_node3/TASK: Verify network state restored to default [0affe814-3a2d-f59f-16b9-0000000000c8] 8454 1726882442.37786: sending task result for task 0affe814-3a2d-f59f-16b9-0000000000c8 8454 1726882442.38068: done sending task result for task 0affe814-3a2d-f59f-16b9-0000000000c8 8454 1726882442.38071: WORKER PROCESS EXITING 8454 1726882442.38101: no more pending results, returning what we have 8454 1726882442.38108: in VariableManager get_vars() 8454 1726882442.38163: Calling all_inventory to load vars for managed_node3 8454 1726882442.38166: Calling groups_inventory to load vars for managed_node3 8454 1726882442.38168: Calling all_plugins_inventory to load vars for managed_node3 8454 1726882442.38185: Calling all_plugins_play to load vars for managed_node3 8454 1726882442.38188: Calling groups_plugins_inventory to load vars for managed_node3 8454 1726882442.38192: Calling groups_plugins_play to load vars for managed_node3 8454 1726882442.42778: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to 
reserved name 8454 1726882442.48424: done with get_vars() 8454 1726882442.48462: variable 'ansible_search_path' from source: unknown 8454 1726882442.48480: we have included files to process 8454 1726882442.48482: generating all_blocks data 8454 1726882442.48488: done generating all_blocks data 8454 1726882442.48493: processing included file: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml 8454 1726882442.48495: loading included file: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml 8454 1726882442.48498: Loading data from /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml 8454 1726882442.49628: done processing included file 8454 1726882442.49631: iterating over new_blocks loaded from include file 8454 1726882442.49633: in VariableManager get_vars() 8454 1726882442.49661: done with get_vars() 8454 1726882442.49663: filtering new block on tags 8454 1726882442.49710: done filtering new block on tags 8454 1726882442.49713: done iterating over new_blocks loaded from include file included: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml for managed_node3 8454 1726882442.49720: extending task lists for all hosts with included blocks 8454 1726882442.53537: done extending task lists 8454 1726882442.53539: done processing included files 8454 1726882442.53540: results queue empty 8454 1726882442.53541: checking for any_errors_fatal 8454 1726882442.53546: done checking for any_errors_fatal 8454 1726882442.53547: checking for max_fail_percentage 8454 1726882442.53549: done checking for max_fail_percentage 8454 1726882442.53550: checking to see if all hosts have failed and the running result is not ok 8454 1726882442.53551: done checking to see if all hosts have failed 8454 1726882442.53552: getting the 
remaining hosts for this loop 8454 1726882442.53553: done getting the remaining hosts for this loop 8454 1726882442.53557: getting the next task for host managed_node3 8454 1726882442.53563: done getting next task for host managed_node3 8454 1726882442.53566: ^ task is: TASK: Check routes and DNS 8454 1726882442.53570: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 8454 1726882442.53573: getting variables 8454 1726882442.53574: in VariableManager get_vars() 8454 1726882442.53595: Calling all_inventory to load vars for managed_node3 8454 1726882442.53598: Calling groups_inventory to load vars for managed_node3 8454 1726882442.53601: Calling all_plugins_inventory to load vars for managed_node3 8454 1726882442.53609: Calling all_plugins_play to load vars for managed_node3 8454 1726882442.53613: Calling groups_plugins_inventory to load vars for managed_node3 8454 1726882442.53616: Calling groups_plugins_play to load vars for managed_node3 8454 1726882442.56507: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8454 1726882442.59674: done with get_vars() 8454 1726882442.59842: done getting variables 8454 1726882442.59896: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Check routes and DNS] **************************************************** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml:6 Friday 20 September 2024 21:34:02 -0400 (0:00:00.246) 0:00:40.617 ****** 8454 1726882442.59937: entering _queue_task() for managed_node3/shell 8454 1726882442.60695: worker is 1 (out of 1 available) 8454 1726882442.60822: exiting _queue_task() for managed_node3/shell 8454 1726882442.60839: done queuing things up, now waiting for results queue to drain 8454 1726882442.60840: waiting for pending results... 
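The task being dispatched here (check_network_dns.yml:6) can be reconstructed almost verbatim, since the module invocation recorded later in the log includes the full `_raw_params`. Surrounding task keywords (tags, `changed_when`, etc.) are not visible in the log and are omitted:

```yaml
# Script body reconstructed from the _raw_params in the recorded module
# invocation; task-level keywords beyond name/shell are not shown in the log.
- name: Check routes and DNS
  shell: |
    set -euo pipefail
    echo IP
    ip a
    echo IP ROUTE
    ip route
    echo IP -6 ROUTE
    ip -6 route
    echo RESOLV
    if [ -f /etc/resolv.conf ]; then
     cat /etc/resolv.conf
    else
     echo NO /etc/resolv.conf
     ls -alrtF /etc/resolv.* || :
    fi
```

Note the `set -euo pipefail` guard: if any probe command fails, the shell aborts immediately and the task fails with a nonzero `rc`, rather than silently printing partial output.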
8454 1726882442.61454: running TaskExecutor() for managed_node3/TASK: Check routes and DNS 8454 1726882442.61641: in run() - task 0affe814-3a2d-f59f-16b9-00000000056d 8454 1726882442.61645: variable 'ansible_search_path' from source: unknown 8454 1726882442.61648: variable 'ansible_search_path' from source: unknown 8454 1726882442.61841: calling self._execute() 8454 1726882442.61970: variable 'ansible_host' from source: host vars for 'managed_node3' 8454 1726882442.62032: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8454 1726882442.62058: variable 'omit' from source: magic vars 8454 1726882442.62937: variable 'ansible_distribution_major_version' from source: facts 8454 1726882442.63011: Evaluated conditional (ansible_distribution_major_version != '6'): True 8454 1726882442.63140: variable 'omit' from source: magic vars 8454 1726882442.63181: variable 'omit' from source: magic vars 8454 1726882442.63266: variable 'omit' from source: magic vars 8454 1726882442.63369: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 8454 1726882442.63478: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 8454 1726882442.63568: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 8454 1726882442.63600: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8454 1726882442.63669: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8454 1726882442.63713: variable 'inventory_hostname' from source: host vars for 'managed_node3' 8454 1726882442.63767: variable 'ansible_host' from source: host vars for 'managed_node3' 8454 1726882442.63780: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8454 1726882442.64141: Set connection var 
ansible_connection to ssh 8454 1726882442.64144: Set connection var ansible_shell_executable to /bin/sh 8454 1726882442.64147: Set connection var ansible_timeout to 10 8454 1726882442.64149: Set connection var ansible_shell_type to sh 8454 1726882442.64152: Set connection var ansible_pipelining to False 8454 1726882442.64166: Set connection var ansible_module_compression to ZIP_DEFLATED 8454 1726882442.64273: variable 'ansible_shell_executable' from source: unknown 8454 1726882442.64279: variable 'ansible_connection' from source: unknown 8454 1726882442.64282: variable 'ansible_module_compression' from source: unknown 8454 1726882442.64284: variable 'ansible_shell_type' from source: unknown 8454 1726882442.64286: variable 'ansible_shell_executable' from source: unknown 8454 1726882442.64289: variable 'ansible_host' from source: host vars for 'managed_node3' 8454 1726882442.64291: variable 'ansible_pipelining' from source: unknown 8454 1726882442.64294: variable 'ansible_timeout' from source: unknown 8454 1726882442.64296: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8454 1726882442.64629: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 8454 1726882442.64653: variable 'omit' from source: magic vars 8454 1726882442.64708: starting attempt loop 8454 1726882442.64723: running the handler 8454 1726882442.64742: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 8454 1726882442.64770: _low_level_execute_command(): starting 8454 
1726882442.64825: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 8454 1726882442.65997: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 8454 1726882442.66030: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8454 1726882442.66129: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 8454 1726882442.66225: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8454 1726882442.68549: stdout chunk (state=3): >>>/root <<< 8454 1726882442.68553: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8454 1726882442.68557: stdout chunk (state=3): >>><<< 8454 1726882442.68560: stderr chunk (state=3): >>><<< 8454 1726882442.68564: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 
'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 8454 1726882442.68567: _low_level_execute_command(): starting 8454 1726882442.68571: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882442.68456-9996-14594044757970 `" && echo ansible-tmp-1726882442.68456-9996-14594044757970="` echo /root/.ansible/tmp/ansible-tmp-1726882442.68456-9996-14594044757970 `" ) && sleep 0' 8454 1726882442.69285: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 8454 1726882442.69301: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 8454 1726882442.69316: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 8454 1726882442.69346: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8454 1726882442.69446: stderr chunk (state=3): >>>debug1: 
configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 8454 1726882442.69469: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8454 1726882442.69616: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8454 1726882442.71874: stdout chunk (state=3): >>>ansible-tmp-1726882442.68456-9996-14594044757970=/root/.ansible/tmp/ansible-tmp-1726882442.68456-9996-14594044757970 <<< 8454 1726882442.72428: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8454 1726882442.72432: stdout chunk (state=3): >>><<< 8454 1726882442.72436: stderr chunk (state=3): >>><<< 8454 1726882442.72441: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882442.68456-9996-14594044757970=/root/.ansible/tmp/ansible-tmp-1726882442.68456-9996-14594044757970 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration 
data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 8454 1726882442.72443: variable 'ansible_module_compression' from source: unknown 8454 1726882442.72445: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-8454ynh6eu7y/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 8454 1726882442.72448: variable 'ansible_facts' from source: unknown 8454 1726882442.72681: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882442.68456-9996-14594044757970/AnsiballZ_command.py 8454 1726882442.72905: Sending initial data 8454 1726882442.72916: Sent initial data (151 bytes) 8454 1726882442.73509: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 8454 1726882442.73526: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 8454 1726882442.73548: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 8454 1726882442.73656: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: 
checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8454 1726882442.73682: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 8454 1726882442.73700: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 8454 1726882442.73723: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8454 1726882442.73870: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8454 1726882442.75724: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 8454 1726882442.75837: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 8454 1726882442.76063: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-8454ynh6eu7y/tmpvyrna1ai /root/.ansible/tmp/ansible-tmp-1726882442.68456-9996-14594044757970/AnsiballZ_command.py <<< 8454 1726882442.76066: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882442.68456-9996-14594044757970/AnsiballZ_command.py" <<< 8454 1726882442.76261: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-8454ynh6eu7y/tmpvyrna1ai" to remote "/root/.ansible/tmp/ansible-tmp-1726882442.68456-9996-14594044757970/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882442.68456-9996-14594044757970/AnsiballZ_command.py" <<< 8454 1726882442.78218: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8454 1726882442.78317: stderr chunk (state=3): >>><<< 8454 1726882442.78321: stdout chunk (state=3): >>><<< 8454 1726882442.78349: done transferring module to remote 8454 1726882442.78460: _low_level_execute_command(): starting 8454 1726882442.78464: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882442.68456-9996-14594044757970/ /root/.ansible/tmp/ansible-tmp-1726882442.68456-9996-14594044757970/AnsiballZ_command.py && sleep 0' 8454 1726882442.79383: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 8454 1726882442.79389: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found <<< 8454 1726882442.79391: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass 
<<< 8454 1726882442.79395: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found <<< 8454 1726882442.79398: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8454 1726882442.79404: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 8454 1726882442.79406: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 8454 1726882442.79471: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8454 1726882442.79647: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8454 1726882442.81625: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8454 1726882442.81705: stderr chunk (state=3): >>><<< 8454 1726882442.81729: stdout chunk (state=3): >>><<< 8454 1726882442.81759: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: 
checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 8454 1726882442.81770: _low_level_execute_command(): starting 8454 1726882442.81784: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882442.68456-9996-14594044757970/AnsiballZ_command.py && sleep 0' 8454 1726882442.82420: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 8454 1726882442.82439: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 8454 1726882442.82456: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 8454 1726882442.82475: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 8454 1726882442.82508: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found <<< 8454 1726882442.82618: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 8454 1726882442.82643: stderr 
chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8454 1726882442.82813: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8454 1726882443.01433: stdout chunk (state=3): >>> {"changed": true, "stdout": "IP\n1: lo: mtu 65536 qdisc noqueue state UNKNOWN group default qlen 1000\n link/loopback 00:00:00:00:00:00 brd 00:00:00:00:00:00\n inet 127.0.0.1/8 scope host lo\n valid_lft forever preferred_lft forever\n inet6 ::1/128 scope host noprefixroute \n valid_lft forever preferred_lft forever\n2: eth0: mtu 9001 qdisc mq state UP group default qlen 1000\n link/ether 0e:39:03:af:ed:a3 brd ff:ff:ff:ff:ff:ff\n altname enX0\n inet 10.31.41.238/22 brd 10.31.43.255 scope global dynamic noprefixroute eth0\n valid_lft 3032sec preferred_lft 3032sec\n inet6 fe80::a0b7:fdc4:48e8:7158/64 scope link noprefixroute \n valid_lft forever preferred_lft forever\nIP ROUTE\ndefault via 10.31.40.1 dev eth0 proto dhcp src 10.31.41.238 metric 100 \n10.31.40.0/22 dev eth0 proto kernel scope link src 10.31.41.238 metric 100 \nIP -6 ROUTE\nfe80::/64 dev eth0 proto kernel metric 1024 pref medium\nRESOLV\n# This is /run/systemd/resolve/stub-resolv.conf managed by man:systemd-resolved(8).\n# Do not edit.\n#\n# This file might be symlinked as /etc/resolv.conf. If you're looking at\n# /etc/resolv.conf and seeing this text, you have followed the symlink.\n#\n# This is a dynamic resolv.conf file for connecting local clients to the\n# internal DNS stub resolver of systemd-resolved. This file lists all\n# configured search domains.\n#\n# Run \"resolvectl status\" to see details about the uplink DNS servers\n# currently in use.\n#\n# Third party programs should typically not access this file directly, but only\n# through the symlink at /etc/resolv.conf. 
To manage man:resolv.conf(5) in a\n# different way, replace this symlink by a static file or a different symlink.\n#\n# See man:systemd-resolved.service(8) for details about the supported modes of\n# operation for /etc/resolv.conf.\n\nnameserver 127.0.0.53\noptions edns0 trust-ad\nsearch us-east-1.aws.redhat.com", "stderr": "", "rc": 0, "cmd": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "start": "2024-09-20 21:34:03.002891", "end": "2024-09-20 21:34:03.012030", "delta": "0:00:00.009139", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 8454 1726882443.03741: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.41.238 closed. 
<<< 8454 1726882443.03745: stdout chunk (state=3): >>><<< 8454 1726882443.03747: stderr chunk (state=3): >>><<< 8454 1726882443.03750: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "IP\n1: lo: mtu 65536 qdisc noqueue state UNKNOWN group default qlen 1000\n link/loopback 00:00:00:00:00:00 brd 00:00:00:00:00:00\n inet 127.0.0.1/8 scope host lo\n valid_lft forever preferred_lft forever\n inet6 ::1/128 scope host noprefixroute \n valid_lft forever preferred_lft forever\n2: eth0: mtu 9001 qdisc mq state UP group default qlen 1000\n link/ether 0e:39:03:af:ed:a3 brd ff:ff:ff:ff:ff:ff\n altname enX0\n inet 10.31.41.238/22 brd 10.31.43.255 scope global dynamic noprefixroute eth0\n valid_lft 3032sec preferred_lft 3032sec\n inet6 fe80::a0b7:fdc4:48e8:7158/64 scope link noprefixroute \n valid_lft forever preferred_lft forever\nIP ROUTE\ndefault via 10.31.40.1 dev eth0 proto dhcp src 10.31.41.238 metric 100 \n10.31.40.0/22 dev eth0 proto kernel scope link src 10.31.41.238 metric 100 \nIP -6 ROUTE\nfe80::/64 dev eth0 proto kernel metric 1024 pref medium\nRESOLV\n# This is /run/systemd/resolve/stub-resolv.conf managed by man:systemd-resolved(8).\n# Do not edit.\n#\n# This file might be symlinked as /etc/resolv.conf. If you're looking at\n# /etc/resolv.conf and seeing this text, you have followed the symlink.\n#\n# This is a dynamic resolv.conf file for connecting local clients to the\n# internal DNS stub resolver of systemd-resolved. This file lists all\n# configured search domains.\n#\n# Run \"resolvectl status\" to see details about the uplink DNS servers\n# currently in use.\n#\n# Third party programs should typically not access this file directly, but only\n# through the symlink at /etc/resolv.conf. 
To manage man:resolv.conf(5) in a\n# different way, replace this symlink by a static file or a different symlink.\n#\n# See man:systemd-resolved.service(8) for details about the supported modes of\n# operation for /etc/resolv.conf.\n\nnameserver 127.0.0.53\noptions edns0 trust-ad\nsearch us-east-1.aws.redhat.com", "stderr": "", "rc": 0, "cmd": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "start": "2024-09-20 21:34:03.002891", "end": "2024-09-20 21:34:03.012030", "delta": "0:00:00.009139", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 
debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.41.238 closed. 8454 1726882443.03758: done with _execute_module (ansible.legacy.command, {'_raw_params': 'set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882442.68456-9996-14594044757970/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 8454 1726882443.03760: _low_level_execute_command(): starting 8454 1726882443.03762: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882442.68456-9996-14594044757970/ > /dev/null 2>&1 && sleep 0' 8454 1726882443.04749: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: 
match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8454 1726882443.04779: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 8454 1726882443.04805: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 8454 1726882443.04825: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8454 1726882443.05090: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8454 1726882443.07102: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8454 1726882443.07107: stdout chunk (state=3): >>><<< 8454 1726882443.07127: stderr chunk (state=3): >>><<< 8454 1726882443.07145: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 8454 1726882443.07153: handler run complete 8454 1726882443.07186: Evaluated conditional (False): False 8454 1726882443.07200: attempt loop complete, returning result 8454 1726882443.07203: _execute() done 8454 1726882443.07208: dumping result to json 8454 1726882443.07217: done dumping result, returning 8454 1726882443.07342: done running TaskExecutor() for managed_node3/TASK: Check routes and DNS [0affe814-3a2d-f59f-16b9-00000000056d] 8454 1726882443.07355: sending task result for task 0affe814-3a2d-f59f-16b9-00000000056d 8454 1726882443.07503: done sending task result for task 0affe814-3a2d-f59f-16b9-00000000056d 8454 1726882443.07506: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "cmd": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "delta": "0:00:00.009139", "end": "2024-09-20 21:34:03.012030", "rc": 0, "start": "2024-09-20 21:34:03.002891" } STDOUT: IP 1: lo: mtu 65536 qdisc noqueue state UNKNOWN group default qlen 1000 link/loopback 00:00:00:00:00:00 brd 00:00:00:00:00:00 inet 127.0.0.1/8 scope host lo valid_lft forever preferred_lft forever inet6 ::1/128 scope host noprefixroute valid_lft forever preferred_lft forever 2: eth0: mtu 9001 qdisc mq state UP group default qlen 1000 link/ether 0e:39:03:af:ed:a3 brd ff:ff:ff:ff:ff:ff altname enX0 inet 10.31.41.238/22 brd 10.31.43.255 scope global dynamic noprefixroute eth0 
valid_lft 3032sec preferred_lft 3032sec inet6 fe80::a0b7:fdc4:48e8:7158/64 scope link noprefixroute valid_lft forever preferred_lft forever IP ROUTE default via 10.31.40.1 dev eth0 proto dhcp src 10.31.41.238 metric 100 10.31.40.0/22 dev eth0 proto kernel scope link src 10.31.41.238 metric 100 IP -6 ROUTE fe80::/64 dev eth0 proto kernel metric 1024 pref medium RESOLV # This is /run/systemd/resolve/stub-resolv.conf managed by man:systemd-resolved(8). # Do not edit. # # This file might be symlinked as /etc/resolv.conf. If you're looking at # /etc/resolv.conf and seeing this text, you have followed the symlink. # # This is a dynamic resolv.conf file for connecting local clients to the # internal DNS stub resolver of systemd-resolved. This file lists all # configured search domains. # # Run "resolvectl status" to see details about the uplink DNS servers # currently in use. # # Third party programs should typically not access this file directly, but only # through the symlink at /etc/resolv.conf. To manage man:resolv.conf(5) in a # different way, replace this symlink by a static file or a different symlink. # # See man:systemd-resolved.service(8) for details about the supported modes of # operation for /etc/resolv.conf. 
nameserver 127.0.0.53 options edns0 trust-ad search us-east-1.aws.redhat.com 8454 1726882443.07633: no more pending results, returning what we have 8454 1726882443.07639: results queue empty 8454 1726882443.07640: checking for any_errors_fatal 8454 1726882443.07642: done checking for any_errors_fatal 8454 1726882443.07643: checking for max_fail_percentage 8454 1726882443.07645: done checking for max_fail_percentage 8454 1726882443.07647: checking to see if all hosts have failed and the running result is not ok 8454 1726882443.07648: done checking to see if all hosts have failed 8454 1726882443.07648: getting the remaining hosts for this loop 8454 1726882443.07651: done getting the remaining hosts for this loop 8454 1726882443.07656: getting the next task for host managed_node3 8454 1726882443.07664: done getting next task for host managed_node3 8454 1726882443.07667: ^ task is: TASK: Verify DNS and network connectivity 8454 1726882443.07672: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 8454 1726882443.07682: getting variables 8454 1726882443.07684: in VariableManager get_vars() 8454 1726882443.07730: Calling all_inventory to load vars for managed_node3 8454 1726882443.07733: Calling groups_inventory to load vars for managed_node3 8454 1726882443.07963: Calling all_plugins_inventory to load vars for managed_node3 8454 1726882443.07978: Calling all_plugins_play to load vars for managed_node3 8454 1726882443.07983: Calling groups_plugins_inventory to load vars for managed_node3 8454 1726882443.07987: Calling groups_plugins_play to load vars for managed_node3 8454 1726882443.12182: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8454 1726882443.17846: done with get_vars() 8454 1726882443.17894: done getting variables 8454 1726882443.17967: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Verify DNS and network connectivity] ************************************* task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml:24 Friday 20 September 2024 21:34:03 -0400 (0:00:00.580) 0:00:41.197 ****** 8454 1726882443.18009: entering _queue_task() for managed_node3/shell 8454 1726882443.18786: worker is 1 (out of 1 available) 8454 1726882443.18800: exiting _queue_task() for managed_node3/shell 8454 1726882443.18816: done queuing things up, now waiting for results queue to drain 8454 1726882443.18818: waiting for pending results... 
8454 1726882443.19575: running TaskExecutor() for managed_node3/TASK: Verify DNS and network connectivity 8454 1726882443.20045: in run() - task 0affe814-3a2d-f59f-16b9-00000000056e 8454 1726882443.20049: variable 'ansible_search_path' from source: unknown 8454 1726882443.20052: variable 'ansible_search_path' from source: unknown 8454 1726882443.20056: calling self._execute() 8454 1726882443.20149: variable 'ansible_host' from source: host vars for 'managed_node3' 8454 1726882443.20627: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8454 1726882443.20631: variable 'omit' from source: magic vars 8454 1726882443.21640: variable 'ansible_distribution_major_version' from source: facts 8454 1726882443.21644: Evaluated conditional (ansible_distribution_major_version != '6'): True 8454 1726882443.21884: variable 'ansible_facts' from source: unknown 8454 1726882443.24214: Evaluated conditional (ansible_facts["distribution"] == "CentOS"): False 8454 1726882443.24219: when evaluation is False, skipping this task 8454 1726882443.24222: _execute() done 8454 1726882443.24226: dumping result to json 8454 1726882443.24231: done dumping result, returning 8454 1726882443.24241: done running TaskExecutor() for managed_node3/TASK: Verify DNS and network connectivity [0affe814-3a2d-f59f-16b9-00000000056e] 8454 1726882443.24248: sending task result for task 0affe814-3a2d-f59f-16b9-00000000056e 8454 1726882443.24740: done sending task result for task 0affe814-3a2d-f59f-16b9-00000000056e 8454 1726882443.24743: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "ansible_facts[\"distribution\"] == \"CentOS\"", "skip_reason": "Conditional result was False" } 8454 1726882443.24799: no more pending results, returning what we have 8454 1726882443.24803: results queue empty 8454 1726882443.24804: checking for any_errors_fatal 8454 1726882443.24814: done checking for any_errors_fatal 8454 1726882443.24815: checking for 
max_fail_percentage 8454 1726882443.24817: done checking for max_fail_percentage 8454 1726882443.24818: checking to see if all hosts have failed and the running result is not ok 8454 1726882443.24819: done checking to see if all hosts have failed 8454 1726882443.24820: getting the remaining hosts for this loop 8454 1726882443.24822: done getting the remaining hosts for this loop 8454 1726882443.24826: getting the next task for host managed_node3 8454 1726882443.24839: done getting next task for host managed_node3 8454 1726882443.24842: ^ task is: TASK: meta (flush_handlers) 8454 1726882443.24844: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8454 1726882443.24848: getting variables 8454 1726882443.24850: in VariableManager get_vars() 8454 1726882443.24892: Calling all_inventory to load vars for managed_node3 8454 1726882443.24896: Calling groups_inventory to load vars for managed_node3 8454 1726882443.24899: Calling all_plugins_inventory to load vars for managed_node3 8454 1726882443.24911: Calling all_plugins_play to load vars for managed_node3 8454 1726882443.24915: Calling groups_plugins_inventory to load vars for managed_node3 8454 1726882443.24919: Calling groups_plugins_play to load vars for managed_node3 8454 1726882443.29380: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8454 1726882443.35905: done with get_vars() 8454 1726882443.36145: done getting variables 8454 1726882443.36241: in VariableManager get_vars() 8454 1726882443.36260: Calling all_inventory to load vars for managed_node3 8454 1726882443.36263: Calling groups_inventory to load vars for managed_node3 8454 1726882443.36266: Calling all_plugins_inventory to load vars for managed_node3 8454 
1726882443.36272: Calling all_plugins_play to load vars for managed_node3 8454 1726882443.36278: Calling groups_plugins_inventory to load vars for managed_node3 8454 1726882443.36282: Calling groups_plugins_play to load vars for managed_node3 8454 1726882443.40274: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8454 1726882443.46167: done with get_vars() 8454 1726882443.46219: done queuing things up, now waiting for results queue to drain 8454 1726882443.46222: results queue empty 8454 1726882443.46223: checking for any_errors_fatal 8454 1726882443.46227: done checking for any_errors_fatal 8454 1726882443.46228: checking for max_fail_percentage 8454 1726882443.46229: done checking for max_fail_percentage 8454 1726882443.46230: checking to see if all hosts have failed and the running result is not ok 8454 1726882443.46231: done checking to see if all hosts have failed 8454 1726882443.46232: getting the remaining hosts for this loop 8454 1726882443.46236: done getting the remaining hosts for this loop 8454 1726882443.46240: getting the next task for host managed_node3 8454 1726882443.46245: done getting next task for host managed_node3 8454 1726882443.46247: ^ task is: TASK: meta (flush_handlers) 8454 1726882443.46249: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8454 1726882443.46252: getting variables 8454 1726882443.46254: in VariableManager get_vars() 8454 1726882443.46274: Calling all_inventory to load vars for managed_node3 8454 1726882443.46280: Calling groups_inventory to load vars for managed_node3 8454 1726882443.46283: Calling all_plugins_inventory to load vars for managed_node3 8454 1726882443.46290: Calling all_plugins_play to load vars for managed_node3 8454 1726882443.46293: Calling groups_plugins_inventory to load vars for managed_node3 8454 1726882443.46297: Calling groups_plugins_play to load vars for managed_node3 8454 1726882443.48425: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8454 1726882443.51680: done with get_vars() 8454 1726882443.51713: done getting variables 8454 1726882443.51983: in VariableManager get_vars() 8454 1726882443.52001: Calling all_inventory to load vars for managed_node3 8454 1726882443.52004: Calling groups_inventory to load vars for managed_node3 8454 1726882443.52007: Calling all_plugins_inventory to load vars for managed_node3 8454 1726882443.52013: Calling all_plugins_play to load vars for managed_node3 8454 1726882443.52016: Calling groups_plugins_inventory to load vars for managed_node3 8454 1726882443.52019: Calling groups_plugins_play to load vars for managed_node3 8454 1726882443.55992: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8454 1726882443.62050: done with get_vars() 8454 1726882443.62096: done queuing things up, now waiting for results queue to drain 8454 1726882443.62099: results queue empty 8454 1726882443.62100: checking for any_errors_fatal 8454 1726882443.62102: done checking for any_errors_fatal 8454 1726882443.62103: checking for max_fail_percentage 8454 1726882443.62104: done checking for max_fail_percentage 8454 1726882443.62105: checking to see if all hosts have failed and the running result is not ok 8454 1726882443.62106: 
done checking to see if all hosts have failed 8454 1726882443.62107: getting the remaining hosts for this loop 8454 1726882443.62108: done getting the remaining hosts for this loop 8454 1726882443.62118: getting the next task for host managed_node3 8454 1726882443.62122: done getting next task for host managed_node3 8454 1726882443.62123: ^ task is: None 8454 1726882443.62125: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8454 1726882443.62127: done queuing things up, now waiting for results queue to drain 8454 1726882443.62128: results queue empty 8454 1726882443.62129: checking for any_errors_fatal 8454 1726882443.62130: done checking for any_errors_fatal 8454 1726882443.62131: checking for max_fail_percentage 8454 1726882443.62132: done checking for max_fail_percentage 8454 1726882443.62336: checking to see if all hosts have failed and the running result is not ok 8454 1726882443.62338: done checking to see if all hosts have failed 8454 1726882443.62342: getting the next task for host managed_node3 8454 1726882443.62346: done getting next task for host managed_node3 8454 1726882443.62348: ^ task is: None 8454 1726882443.62349: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False PLAY RECAP ********************************************************************* managed_node3 : ok=75 changed=2 unreachable=0 failed=0 skipped=61 rescued=0 ignored=0 Friday 20 September 2024 21:34:03 -0400 (0:00:00.444) 0:00:41.642 ****** =============================================================================== fedora.linux_system_roles.network : Check which services are running ---- 2.34s /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 fedora.linux_system_roles.network : Check which services are running ---- 2.21s /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 Install dnsmasq --------------------------------------------------------- 2.18s /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_test_interfaces_with_dhcp.yml:3 Gathering Facts --------------------------------------------------------- 1.97s /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/tests_bond_nm.yml:6 Create test interfaces -------------------------------------------------- 1.84s /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_test_interfaces_with_dhcp.yml:35 Install pgrep, sysctl --------------------------------------------------- 1.81s /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_test_interfaces_with_dhcp.yml:26 fedora.linux_system_roles.network : Check which packages are installed --- 1.54s /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 Gathering Facts --------------------------------------------------------- 1.34s /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond.yml:3 fedora.linux_system_roles.network : Configure networking connection profiles --- 1.06s 
/tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 fedora.linux_system_roles.network : Check which packages are installed --- 1.06s /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 fedora.linux_system_roles.network : Configure networking connection profiles --- 1.03s /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 fedora.linux_system_roles.network : Enable and start NetworkManager ----- 0.99s /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 fedora.linux_system_roles.network : Enable and start NetworkManager ----- 0.85s /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 Gather the minimum subset of ansible_facts required by the network role test --- 0.80s /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:3 Check if system is ostree ----------------------------------------------- 0.75s /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:17 fedora.linux_system_roles.network : Re-test connectivity ---------------- 0.69s /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192 ** TEST check IPv6 ------------------------------------------------------ 0.69s /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond.yml:87 Remove test interfaces -------------------------------------------------- 0.62s /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/remove_test_interfaces_with_dhcp.yml:3 Stat profile file ------------------------------------------------------- 0.60s /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:9 
Stop dnsmasq/radvd services --------------------------------------------- 0.60s /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/remove_test_interfaces_with_dhcp.yml:23 8454 1726882443.62505: RUNNING CLEANUP