44109 1727204222.38677: starting run
ansible-playbook [core 2.17.4]
  config file = None
  configured module search path = ['/root/.ansible/plugins/modules', '/usr/share/ansible/plugins/modules']
  ansible python module location = /usr/local/lib/python3.12/site-packages/ansible
  ansible collection location = /tmp/collections-bGV
  executable location = /usr/local/bin/ansible-playbook
  python version = 3.12.5 (main, Aug 23 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] (/usr/bin/python3.12)
  jinja version = 3.1.4
  libyaml = True
No config file found; using defaults
44109 1727204222.39215: Added group all to inventory
44109 1727204222.39217: Added group ungrouped to inventory
44109 1727204222.39222: Group all now contains ungrouped
44109 1727204222.39225: Examining possible inventory source: /tmp/network-zt6/inventory-rSl.yml
44109 1727204222.72348: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/cache
44109 1727204222.72617: Loading CacheModule 'memory' from /usr/local/lib/python3.12/site-packages/ansible/plugins/cache/memory.py
44109 1727204222.72644: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory
44109 1727204222.72707: Loading InventoryModule 'host_list' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/host_list.py
44109 1727204222.73087: Loaded config def from plugin (inventory/script)
44109 1727204222.73090: Loading InventoryModule 'script' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/script.py
44109 1727204222.73134: Loading InventoryModule 'auto' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/auto.py
44109 1727204222.73227: Loaded config def from plugin (inventory/yaml)
44109 1727204222.73229: Loading InventoryModule 'yaml' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/yaml.py
44109 1727204222.73526: Loading InventoryModule 'ini' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/ini.py
44109 1727204222.74371: Loading InventoryModule 'toml' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/toml.py
44109 1727204222.74375: Attempting to use plugin host_list (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/host_list.py)
44109 1727204222.74380: Attempting to use plugin script (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/script.py)
44109 1727204222.74387: Attempting to use plugin auto (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/auto.py)
44109 1727204222.74392: Loading data from /tmp/network-zt6/inventory-rSl.yml
44109 1727204222.74460: /tmp/network-zt6/inventory-rSl.yml was not parsable by auto
44109 1727204222.74665: Attempting to use plugin yaml (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/yaml.py)
44109 1727204222.74811: Loading data from /tmp/network-zt6/inventory-rSl.yml
44109 1727204222.75100: group all already in inventory
44109 1727204222.75108: set inventory_file for managed-node1
44109 1727204222.75112: set inventory_dir for managed-node1
44109 1727204222.75113: Added host managed-node1 to inventory
44109 1727204222.75116: Added host managed-node1 to group all
44109 1727204222.75117: set ansible_host for managed-node1
44109 1727204222.75117: set ansible_ssh_extra_args for managed-node1
44109 1727204222.75120: set inventory_file for managed-node2
44109 1727204222.75123: set inventory_dir for managed-node2
44109 1727204222.75124: Added host managed-node2 to inventory
44109 1727204222.75125: Added host managed-node2 to group all
44109 1727204222.75126: set ansible_host for managed-node2
44109 1727204222.75127: set ansible_ssh_extra_args for managed-node2
44109 1727204222.75129: set inventory_file for managed-node3
44109 1727204222.75131: set inventory_dir for managed-node3
44109 1727204222.75132: Added host managed-node3 to inventory
44109 1727204222.75133: Added host managed-node3 to group all
44109 1727204222.75134: set ansible_host for managed-node3
44109 1727204222.75134: set ansible_ssh_extra_args for managed-node3
44109 1727204222.75137: Reconcile groups and hosts in inventory.
44109 1727204222.75141: Group ungrouped now contains managed-node1
44109 1727204222.75142: Group ungrouped now contains managed-node2
44109 1727204222.75144: Group ungrouped now contains managed-node3
44109 1727204222.75228: '/usr/local/lib/python3.12/site-packages/ansible/plugins/vars/__init__' skipped due to reserved name
44109 1727204222.75561: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments
44109 1727204222.75610: Loading ModuleDocFragment 'vars_plugin_staging' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/vars_plugin_staging.py
44109 1727204222.75639: Loaded config def from plugin (vars/host_group_vars)
44109 1727204222.75641: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.12/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=False, class_only=True)
44109 1727204222.75648: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/vars
44109 1727204222.75657: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.12/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False)
44109 1727204222.75809: Loading CacheModule 'memory' from /usr/local/lib/python3.12/site-packages/ansible/plugins/cache/memory.py (found_in_cache=True, class_only=False)
44109 1727204222.76456: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
44109 1727204222.76659: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py
44109 1727204222.76702: Loaded config def from plugin (connection/local)
44109 1727204222.76706: Loading Connection 'local' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/local.py (found_in_cache=False, class_only=True)
44109 1727204222.78138: Loaded config def from plugin (connection/paramiko_ssh)
44109 1727204222.78143: Loading Connection 'paramiko_ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/paramiko_ssh.py (found_in_cache=False, class_only=True)
44109 1727204222.80444: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py (found_in_cache=True, class_only=False)
44109 1727204222.80489: Loaded config def from plugin (connection/psrp)
44109 1727204222.80493: Loading Connection 'psrp' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/psrp.py (found_in_cache=False, class_only=True)
44109 1727204222.82419: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py (found_in_cache=True, class_only=False)
44109 1727204222.82462: Loaded config def from plugin (connection/ssh)
44109 1727204222.82466: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=False, class_only=True)
44109 1727204222.86869: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py (found_in_cache=True, class_only=False)
44109 1727204222.86913: Loaded config def from plugin (connection/winrm)
44109 1727204222.86917: Loading Connection 'winrm' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/winrm.py (found_in_cache=False, class_only=True)
44109 1727204222.86948: '/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/__init__' skipped due to reserved name
44109 1727204222.87221: Loading ModuleDocFragment 'shell_windows' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/shell_windows.py
44109 1727204222.87289: Loaded config def from plugin (shell/cmd)
44109 1727204222.87292: Loading ShellModule 'cmd' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/cmd.py (found_in_cache=False, class_only=True)
44109 1727204222.87319: Loading ModuleDocFragment 'shell_windows' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/shell_windows.py (found_in_cache=True, class_only=False)
44109 1727204222.87687: Loaded config def from plugin (shell/powershell)
44109 1727204222.87690: Loading ShellModule 'powershell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/powershell.py (found_in_cache=False, class_only=True)
44109 1727204222.87744: Loading ModuleDocFragment 'shell_common' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/shell_common.py
44109 1727204222.88120: Loaded config def from plugin (shell/sh)
44109 1727204222.88122: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=False, class_only=True)
44109 1727204222.88157: '/usr/local/lib/python3.12/site-packages/ansible/plugins/become/__init__' skipped due to reserved name
44109 1727204222.88271: Loaded config def from plugin (become/runas)
44109 1727204222.88274: Loading BecomeModule 'runas' from /usr/local/lib/python3.12/site-packages/ansible/plugins/become/runas.py (found_in_cache=False, class_only=True)
44109 1727204222.88657: Loaded config def from plugin (become/su)
44109 1727204222.88660: Loading BecomeModule 'su' from /usr/local/lib/python3.12/site-packages/ansible/plugins/become/su.py (found_in_cache=False, class_only=True)
44109 1727204222.88917: Loaded config def from plugin (become/sudo)
44109 1727204222.88919: Loading BecomeModule 'sudo' from /usr/local/lib/python3.12/site-packages/ansible/plugins/become/sudo.py (found_in_cache=False, class_only=True)
running playbook inside collection fedora.linux_system_roles
44109 1727204222.88953: Loading data from /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/tests_routing_rules_nm.yml
44109 1727204222.89798: in VariableManager get_vars()
44109 1727204222.89820: done with get_vars()
44109 1727204222.90155: trying /usr/local/lib/python3.12/site-packages/ansible/modules
44109 1727204222.96222: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/action
44109 1727204222.96544: in VariableManager get_vars()
44109 1727204222.96550: done with get_vars()
44109 1727204222.96553: variable 'playbook_dir' from source: magic vars
44109 1727204222.96554: variable 'ansible_playbook_python' from source: magic vars
44109 1727204222.96554: variable 'ansible_config_file' from source: magic vars
44109 1727204222.96555: variable 'groups' from source: magic vars
44109 1727204222.96556: variable 'omit' from source: magic vars
44109 1727204222.96557: variable 'ansible_version' from source: magic vars
44109 1727204222.96557: variable 'ansible_check_mode' from source: magic vars
44109 1727204222.96558: variable 'ansible_diff_mode' from source: magic vars
44109 1727204222.96559: variable 'ansible_forks' from source: magic vars
44109 1727204222.96559: variable 'ansible_inventory_sources' from source: magic vars
44109 1727204222.96560: variable 'ansible_skip_tags' from source: magic vars
44109 1727204222.96561: variable 'ansible_limit' from source: magic vars
44109 1727204222.96561: variable 'ansible_run_tags' from source: magic vars
44109 1727204222.96562: variable 'ansible_verbosity' from source: magic vars
44109 1727204222.96602: Loading data from /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_routing_rules.yml
44109 1727204222.98648: in VariableManager get_vars()
44109 1727204222.98667: done with get_vars()
44109 1727204222.98912: in VariableManager get_vars()
44109 1727204222.98928: done with get_vars()
44109 1727204222.98964: in VariableManager get_vars()
44109 1727204222.98979: done with get_vars()
44109 1727204222.99061: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__
44109 1727204222.99388: in VariableManager get_vars()
44109 1727204222.99404: done with get_vars()
44109 1727204222.99409: variable 'omit' from source: magic vars
44109 1727204222.99428: variable 'omit' from source: magic vars
44109 1727204222.99463: in VariableManager get_vars()
44109 1727204222.99475: done with get_vars()
44109 1727204222.99730: in VariableManager get_vars()
44109 1727204222.99744: done with get_vars()
44109 1727204222.99782: Loading data from /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml
44109 1727204223.00109: Loading data from /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml
44109 1727204223.00518: Loading data from /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml
44109 1727204223.01265: in VariableManager get_vars()
44109 1727204223.01291: done with get_vars()
44109 1727204223.01766: trying /usr/local/lib/python3.12/site-packages/ansible/modules/__pycache__
redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf
44109 1727204223.09222: in VariableManager get_vars()
44109 1727204223.09226: done with get_vars()
44109 1727204223.09229: variable 'playbook_dir' from source: magic vars
44109 1727204223.09230: variable 'ansible_playbook_python' from source: magic vars
44109 1727204223.09230: variable 'ansible_config_file' from source: magic vars
44109 1727204223.09231: variable 'groups' from source: magic vars
44109 1727204223.09232: variable 'omit' from source: magic vars
44109 1727204223.09233: variable 'ansible_version' from source: magic vars
44109 1727204223.09234: variable 'ansible_check_mode' from source: magic vars
44109 1727204223.09234: variable 'ansible_diff_mode' from source: magic vars
44109 1727204223.09235: variable 'ansible_forks' from source: magic vars
44109 1727204223.09236: variable 'ansible_inventory_sources' from source: magic vars
44109 1727204223.09237: variable 'ansible_skip_tags' from source: magic vars
44109 1727204223.09237: variable 'ansible_limit' from source: magic vars
44109 1727204223.09238: variable 'ansible_run_tags' from source: magic vars
44109 1727204223.09239: variable 'ansible_verbosity' from source: magic vars
44109 1727204223.09274: Loading data from /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/down_profile+delete_interface.yml
44109 1727204223.09556: in VariableManager get_vars()
44109 1727204223.09560: done with get_vars()
44109 1727204223.09563: variable 'playbook_dir' from source: magic vars
44109 1727204223.09564: variable 'ansible_playbook_python' from source: magic vars
44109 1727204223.09564: variable 'ansible_config_file' from source: magic vars
44109 1727204223.09565: variable 'groups' from source: magic vars
44109 1727204223.09566: variable 'omit' from source: magic vars
44109 1727204223.09567: variable 'ansible_version' from source: magic vars
44109 1727204223.09567: variable 'ansible_check_mode' from source: magic vars
44109 1727204223.09568: variable 'ansible_diff_mode' from source: magic vars
44109 1727204223.09569: variable 'ansible_forks' from source: magic vars
44109 1727204223.09570: variable 'ansible_inventory_sources' from source: magic vars
44109 1727204223.09570: variable 'ansible_skip_tags' from source: magic vars
44109 1727204223.09571: variable 'ansible_limit' from source: magic vars
44109 1727204223.09572: variable 'ansible_run_tags' from source: magic vars
44109 1727204223.09572: variable 'ansible_verbosity' from source: magic vars
44109 1727204223.09607: Loading data from /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/down_profile.yml
44109 1727204223.09878: in VariableManager get_vars()
44109 1727204223.09894: done with get_vars()
44109 1727204223.09939: Loading data from /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml
44109 1727204223.10066: Loading data from /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml
44109 1727204223.10252: Loading data from /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml
44109 1727204223.10918: in VariableManager get_vars()
44109 1727204223.10949: done with get_vars()
redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf
44109 1727204223.12607: in VariableManager get_vars()
44109 1727204223.12623: done with get_vars()
44109 1727204223.12661: in VariableManager get_vars()
44109 1727204223.12664: done with get_vars()
44109 1727204223.12666: variable 'playbook_dir' from source: magic vars
44109 1727204223.12667: variable 'ansible_playbook_python' from source: magic vars
44109 1727204223.12668: variable 'ansible_config_file' from source: magic vars
44109 1727204223.12669: variable 'groups' from source: magic vars
44109 1727204223.12670: variable 'omit' from source: magic vars
44109 1727204223.12671: variable 'ansible_version' from source: magic vars
44109 1727204223.12671: variable 'ansible_check_mode' from source: magic vars
44109 1727204223.12672: variable 'ansible_diff_mode' from source: magic vars
44109 1727204223.12673: variable 'ansible_forks' from source: magic vars
44109 1727204223.12673: variable 'ansible_inventory_sources' from source: magic vars
44109 1727204223.12674: variable 'ansible_skip_tags' from source: magic vars
44109 1727204223.12680: variable 'ansible_limit' from source: magic vars
44109 1727204223.12681: variable 'ansible_run_tags' from source: magic vars
44109 1727204223.12681: variable 'ansible_verbosity' from source: magic vars
44109 1727204223.12716: Loading data from /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/remove_profile.yml
44109 1727204223.12809: in VariableManager get_vars()
44109 1727204223.12822: done with get_vars()
44109 1727204223.12862: Loading data from /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml
44109 1727204223.12980: Loading data from /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml
44109 1727204223.13058: Loading data from /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml
44109 1727204223.13471: in VariableManager get_vars()
44109 1727204223.13493: done with get_vars()
redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf
44109 1727204223.15367: in VariableManager get_vars()
44109 1727204223.15488: done with get_vars()
44109 1727204223.15532: in VariableManager get_vars()
44109 1727204223.15544: done with get_vars()
44109 1727204223.15581: in VariableManager get_vars()
44109 1727204223.15592: done with get_vars()
44109 1727204223.15771: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/callback
44109 1727204223.15853: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__
redirecting (type: callback) ansible.builtin.debug to ansible.posix.debug
redirecting (type: callback) ansible.builtin.debug to ansible.posix.debug
44109 1727204223.16330: Loading ModuleDocFragment 'default_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/default_callback.py
44109 1727204223.16713: Loaded config def from plugin (callback/ansible_collections.ansible.posix.plugins.callback.debug)
44109 1727204223.16716: Loading CallbackModule 'ansible_collections.ansible.posix.plugins.callback.debug' from /tmp/collections-bGV/ansible_collections/ansible/posix/plugins/callback/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__)
44109 1727204223.16749: '/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__init__' skipped due to reserved name
44109 1727204223.16774: Loading ModuleDocFragment 'default_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/default_callback.py (found_in_cache=True, class_only=False)
44109 1727204223.17118: Loading ModuleDocFragment 'result_format_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/result_format_callback.py
44109 1727204223.17380: Loaded config def from plugin (callback/default)
44109 1727204223.17383: Loading CallbackModule 'default' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/default.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__) (found_in_cache=False, class_only=True)
44109 1727204223.19786: Loaded config def from plugin (callback/junit)
44109 1727204223.19790: Loading CallbackModule 'junit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/junit.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__) (found_in_cache=False, class_only=True)
44109 1727204223.19844: Loading ModuleDocFragment 'result_format_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/result_format_callback.py (found_in_cache=True, class_only=False)
44109 1727204223.20057: Loaded config def from plugin (callback/minimal)
44109 1727204223.20060: Loading CallbackModule 'minimal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/minimal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__) (found_in_cache=False, class_only=True)
44109 1727204223.20102: Loading CallbackModule 'oneline' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/oneline.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__) (found_in_cache=False, class_only=True)
44109 1727204223.20211: Loaded config def from plugin (callback/tree)
44109 1727204223.20213: Loading CallbackModule 'tree' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/tree.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__) (found_in_cache=False, class_only=True)
redirecting (type: callback) ansible.builtin.profile_tasks to ansible.posix.profile_tasks
44109 1727204223.20442: Loaded config def from plugin (callback/ansible_collections.ansible.posix.plugins.callback.profile_tasks)
44109 1727204223.20444: Loading CallbackModule 'ansible_collections.ansible.posix.plugins.callback.profile_tasks' from /tmp/collections-bGV/ansible_collections/ansible/posix/plugins/callback/profile_tasks.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__) (found_in_cache=False, class_only=True)
Skipping callback 'default', as we already have a stdout callback.
Skipping callback 'minimal', as we already have a stdout callback.
Skipping callback 'oneline', as we already have a stdout callback.

PLAYBOOK: tests_routing_rules_nm.yml *******************************************
6 plays in /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/tests_routing_rules_nm.yml
44109 1727204223.20588: in VariableManager get_vars()
44109 1727204223.20604: done with get_vars()
44109 1727204223.20609: in VariableManager get_vars()
44109 1727204223.20618: done with get_vars()
44109 1727204223.20622: variable 'omit' from source: magic vars
44109 1727204223.20657: in VariableManager get_vars()
44109 1727204223.20779: done with get_vars()
44109 1727204223.20803: variable 'omit' from source: magic vars

PLAY [Run playbook 'playbooks/tests_routing_rules.yml' with nm as provider] ****
44109 1727204223.21503: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy
44109 1727204223.21588: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py
44109 1727204223.21621: getting the remaining hosts for this loop
44109 1727204223.21623: done getting the remaining hosts for this loop
44109 1727204223.21631: getting the next task for host managed-node1
44109 1727204223.21635: done getting next task for host managed-node1
44109 1727204223.21637: ^ task is: TASK: Gathering Facts
44109 1727204223.21639: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
44109 1727204223.21649: getting variables
44109 1727204223.21650: in VariableManager get_vars()
44109 1727204223.21661: Calling all_inventory to load vars for managed-node1
44109 1727204223.21664: Calling groups_inventory to load vars for managed-node1
44109 1727204223.21666: Calling all_plugins_inventory to load vars for managed-node1
44109 1727204223.21683: Calling all_plugins_play to load vars for managed-node1
44109 1727204223.21695: Calling groups_plugins_inventory to load vars for managed-node1
44109 1727204223.21699: Calling groups_plugins_play to load vars for managed-node1
44109 1727204223.21732: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
44109 1727204223.21794: done with get_vars()
44109 1727204223.21801: done getting variables
44109 1727204223.22015: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True)

TASK [Gathering Facts] *********************************************************
task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/tests_routing_rules_nm.yml:6
Tuesday 24 September 2024  14:57:03 -0400 (0:00:00.016)       0:00:00.016 *****
44109 1727204223.22037: entering _queue_task() for managed-node1/gather_facts
44109 1727204223.22039: Creating lock for gather_facts
44109 1727204223.22451: worker is 1 (out of 1 available)
44109 1727204223.22461: exiting _queue_task() for managed-node1/gather_facts
44109 1727204223.22579: done queuing things up, now waiting for results queue to drain
44109 1727204223.22581: waiting for pending results...
44109 1727204223.22720: running TaskExecutor() for managed-node1/TASK: Gathering Facts
44109 1727204223.22958: in run() - task 028d2410-947f-ed67-a560-0000000000af
44109 1727204223.22961: variable 'ansible_search_path' from source: unknown
44109 1727204223.22964: calling self._execute()
44109 1727204223.22974: variable 'ansible_host' from source: host vars for 'managed-node1'
44109 1727204223.22987: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1'
44109 1727204223.23000: variable 'omit' from source: magic vars
44109 1727204223.23114: variable 'omit' from source: magic vars
44109 1727204223.23147: variable 'omit' from source: magic vars
44109 1727204223.23199: variable 'omit' from source: magic vars
44109 1727204223.23247: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection
44109 1727204223.23298: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
44109 1727204223.23321: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell
44109 1727204223.23344: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
44109 1727204223.23360: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
44109 1727204223.23405: variable 'inventory_hostname' from source: host vars for 'managed-node1'
44109 1727204223.23415: variable 'ansible_host' from source: host vars for 'managed-node1'
44109 1727204223.23424: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1'
44109 1727204223.23540: Set connection var ansible_connection to ssh
44109 1727204223.23552: Set connection var ansible_timeout to 10
44109 1727204223.23562: Set connection var ansible_module_compression to ZIP_DEFLATED
44109 1727204223.23573: Set connection var ansible_pipelining to False
44109 1727204223.23585: Set connection var ansible_shell_executable to /bin/sh
44109 1727204223.23594: Set connection var ansible_shell_type to sh
44109 1727204223.23630: variable 'ansible_shell_executable' from source: unknown
44109 1727204223.23680: variable 'ansible_connection' from source: unknown
44109 1727204223.23684: variable 'ansible_module_compression' from source: unknown
44109 1727204223.23686: variable 'ansible_shell_type' from source: unknown
44109 1727204223.23688: variable 'ansible_shell_executable' from source: unknown
44109 1727204223.23691: variable 'ansible_host' from source: host vars for 'managed-node1'
44109 1727204223.23693: variable 'ansible_pipelining' from source: unknown
44109 1727204223.23695: variable 'ansible_timeout' from source: unknown
44109 1727204223.23698: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1'
44109 1727204223.23879: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False)
44109 1727204223.23896: variable 'omit' from source: magic vars
44109 1727204223.23906: starting attempt loop
44109 1727204223.23936: running the handler
44109 1727204223.23942: variable 'ansible_facts' from source: unknown
44109 1727204223.23970: _low_level_execute_command(): starting
44109 1727204223.24046: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0'
44109 1727204223.25129: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
44109 1727204223.25157: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK <<<
44109 1727204223.25184: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
44109 1727204223.25300: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
44109 1727204223.27232: stdout chunk (state=3): >>>/root <<<
44109 1727204223.27301: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
44109 1727204223.27483: stdout chunk (state=3): >>><<<
44109 1727204223.27486: stderr chunk (state=3): >>><<<
44109 1727204223.27489: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0
44109 1727204223.27492: _low_level_execute_command(): starting
44109 1727204223.27495: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204223.2742577-44248-202540709669600 `" && echo ansible-tmp-1727204223.2742577-44248-202540709669600="` echo /root/.ansible/tmp/ansible-tmp-1727204223.2742577-44248-202540709669600 `" ) && sleep 0'
44109 1727204223.28734: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<<
44109 1727204223.28749: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<<
44109 1727204223.28793: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
44109 1727204223.29031: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<<
44109 1727204223.29092: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<<
44109 1727204223.29272: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
44109 1727204223.31406: stdout chunk (state=3): >>>ansible-tmp-1727204223.2742577-44248-202540709669600=/root/.ansible/tmp/ansible-tmp-1727204223.2742577-44248-202540709669600 <<<
44109 1727204223.31504: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
44109 1727204223.31543: stderr chunk (state=3): >>><<<
44109 1727204223.31552: stdout chunk (state=3): >>><<<
44109 1727204223.31603: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204223.2742577-44248-202540709669600=/root/.ansible/tmp/ansible-tmp-1727204223.2742577-44248-202540709669600 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2:
mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44109 1727204223.31885: variable 'ansible_module_compression' from source: unknown 44109 1727204223.31889: ANSIBALLZ: Using generic lock for ansible.legacy.setup 44109 1727204223.31891: ANSIBALLZ: Acquiring lock 44109 1727204223.31893: ANSIBALLZ: Lock acquired: 139907468546112 44109 1727204223.31895: ANSIBALLZ: Creating module 44109 1727204223.90597: ANSIBALLZ: Writing module into payload 44109 1727204223.91065: ANSIBALLZ: Writing module 44109 1727204223.91365: ANSIBALLZ: Renaming module 44109 1727204223.91369: ANSIBALLZ: Done creating module 44109 1727204223.91412: variable 'ansible_facts' from source: unknown 44109 1727204223.91455: variable 'inventory_hostname' from source: host vars for 'managed-node1' 44109 1727204223.91458: _low_level_execute_command(): starting 44109 1727204223.91461: _low_level_execute_command(): executing: /bin/sh -c 'echo PLATFORM; uname; echo FOUND; command -v '"'"'python3.12'"'"'; command -v '"'"'python3.11'"'"'; command -v '"'"'python3.10'"'"'; command -v '"'"'python3.9'"'"'; command -v '"'"'python3.8'"'"'; command -v '"'"'python3.7'"'"'; command -v '"'"'/usr/bin/python3'"'"'; command -v '"'"'python3'"'"'; echo ENDFOUND && sleep 0' 44109 1727204223.92633: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44109 1727204223.92739: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44109 1727204223.92743: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44109 1727204223.92746: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44109 1727204223.92848: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 44109 1727204223.92852: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44109 1727204223.92962: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44109 1727204223.93174: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44109 1727204223.94957: stdout chunk (state=3): >>>PLATFORM Linux FOUND /usr/bin/python3.12 /usr/bin/python3 <<< 44109 1727204223.94979: stdout chunk (state=3): >>>/usr/bin/python3 ENDFOUND <<< 44109 1727204223.95347: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44109 1727204223.95356: stdout chunk (state=3): >>><<< 44109 1727204223.95359: stderr chunk (state=3): >>><<< 44109 1727204223.95681: _low_level_execute_command() done: rc=0, stdout=PLATFORM Linux FOUND /usr/bin/python3.12 /usr/bin/python3 /usr/bin/python3 ENDFOUND , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing 
configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44109 1727204223.95687 [managed-node1]: found interpreters: ['/usr/bin/python3.12', '/usr/bin/python3', '/usr/bin/python3'] 44109 1727204223.95691: _low_level_execute_command(): starting 44109 1727204223.95693: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 && sleep 0' 44109 1727204223.95978: Sending initial data 44109 1727204223.95981: Sent initial data (1181 bytes) 44109 1727204223.97094: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44109 1727204223.97199: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK <<< 44109 1727204223.97683: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44109 1727204223.97687: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44109 1727204224.01136: stdout chunk (state=3): >>>{"platform_dist_result": [], "osrelease_content": "NAME=\"CentOS Stream\"\nVERSION=\"10 (Coughlan)\"\nID=\"centos\"\nID_LIKE=\"rhel fedora\"\nVERSION_ID=\"10\"\nPLATFORM_ID=\"platform:el10\"\nPRETTY_NAME=\"CentOS Stream 10 (Coughlan)\"\nANSI_COLOR=\"0;31\"\nLOGO=\"fedora-logo-icon\"\nCPE_NAME=\"cpe:/o:centos:centos:10\"\nHOME_URL=\"https://centos.org/\"\nVENDOR_NAME=\"CentOS\"\nVENDOR_URL=\"https://centos.org/\"\nBUG_REPORT_URL=\"https://issues.redhat.com/\"\nREDHAT_SUPPORT_PRODUCT=\"Red Hat Enterprise Linux 10\"\nREDHAT_SUPPORT_PRODUCT_VERSION=\"CentOS Stream\"\n"} <<< 44109 1727204224.01599: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44109 1727204224.01603: stdout chunk (state=3): >>><<< 44109 1727204224.01606: stderr chunk (state=3): >>><<< 44109 1727204224.01631: _low_level_execute_command() done: rc=0, stdout={"platform_dist_result": [], "osrelease_content": "NAME=\"CentOS Stream\"\nVERSION=\"10 (Coughlan)\"\nID=\"centos\"\nID_LIKE=\"rhel fedora\"\nVERSION_ID=\"10\"\nPLATFORM_ID=\"platform:el10\"\nPRETTY_NAME=\"CentOS Stream 10 (Coughlan)\"\nANSI_COLOR=\"0;31\"\nLOGO=\"fedora-logo-icon\"\nCPE_NAME=\"cpe:/o:centos:centos:10\"\nHOME_URL=\"https://centos.org/\"\nVENDOR_NAME=\"CentOS\"\nVENDOR_URL=\"https://centos.org/\"\nBUG_REPORT_URL=\"https://issues.redhat.com/\"\nREDHAT_SUPPORT_PRODUCT=\"Red Hat Enterprise Linux 10\"\nREDHAT_SUPPORT_PRODUCT_VERSION=\"CentOS Stream\"\n"} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44109 1727204224.01812: variable 'ansible_facts' from source: unknown 44109 1727204224.01818: variable 'ansible_facts' from source: unknown 44109 1727204224.01829: variable 'ansible_module_compression' from source: unknown 44109 1727204224.01870: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-44109pzfqangk/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 44109 1727204224.02039: variable 'ansible_facts' from source: unknown 44109 1727204224.02409: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204223.2742577-44248-202540709669600/AnsiballZ_setup.py 44109 1727204224.02863: Sending initial data 44109 1727204224.02866: Sent initial data (154 bytes) 44109 1727204224.04074: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44109 1727204224.04191: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44109 1727204224.04382: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 44109 1727204224.04386: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44109 1727204224.04400: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44109 1727204224.04512: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44109 1727204224.06262: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 
44109 1727204224.06355: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 44109 1727204224.06486: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-44109pzfqangk/tmpa3adyqi1 /root/.ansible/tmp/ansible-tmp-1727204223.2742577-44248-202540709669600/AnsiballZ_setup.py <<< 44109 1727204224.06489: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204223.2742577-44248-202540709669600/AnsiballZ_setup.py" <<< 44109 1727204224.06648: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-44109pzfqangk/tmpa3adyqi1" to remote "/root/.ansible/tmp/ansible-tmp-1727204223.2742577-44248-202540709669600/AnsiballZ_setup.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204223.2742577-44248-202540709669600/AnsiballZ_setup.py" <<< 44109 1727204224.10160: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44109 1727204224.10164: stdout chunk (state=3): >>><<< 44109 1727204224.10166: stderr chunk (state=3): >>><<< 44109 1727204224.10293: done transferring module to remote 44109 1727204224.10307: _low_level_execute_command(): starting 44109 1727204224.10311: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204223.2742577-44248-202540709669600/ /root/.ansible/tmp/ansible-tmp-1727204223.2742577-44248-202540709669600/AnsiballZ_setup.py && sleep 0' 44109 1727204224.11497: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match 
pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 44109 1727204224.11682: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44109 1727204224.11692: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44109 1727204224.11694: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 <<< 44109 1727204224.11707: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 44109 1727204224.11847: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 44109 1727204224.11963: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 44109 1727204224.14394: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44109 1727204224.14405: stdout chunk (state=3): >>><<< 44109 1727204224.14422: stderr chunk (state=3): >>><<< 44109 1727204224.14450: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 44109 1727204224.14629: _low_level_execute_command(): starting 44109 1727204224.14633: _low_level_execute_command(): executing: /bin/sh -c 'PYTHONVERBOSE=1 /usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204223.2742577-44248-202540709669600/AnsiballZ_setup.py && sleep 0' 44109 1727204224.15897: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44109 1727204224.16029: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK <<< 44109 1727204224.16033: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44109 
1727204224.16331: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44109 1727204224.18849: stdout chunk (state=3): >>>import _frozen_importlib # frozen <<< 44109 1727204224.18857: stdout chunk (state=3): >>>import _imp # builtin <<< 44109 1727204224.18877: stdout chunk (state=3): >>>import '_thread' # import '_warnings' # import '_weakref' # import '_io' # import 'marshal' # <<< 44109 1727204224.18906: stdout chunk (state=3): >>>import 'posix' # <<< 44109 1727204224.18948: stdout chunk (state=3): >>>import '_frozen_importlib_external' # # installing zipimport hook <<< 44109 1727204224.19086: stdout chunk (state=3): >>>import 'time' # import 'zipimport' # # installed zipimport hook # /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' import '_codecs' # <<< 44109 1727204224.19090: stdout chunk (state=3): >>>import 'codecs' # <<< 44109 1727204224.19183: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8b83b104d0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8b83adfb30> # /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py <<< 44109 1727204224.19200: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8b83b12a50> <<< 44109 1727204224.19222: stdout chunk (state=3): >>>import '_signal' # <<< 44109 1727204224.19245: stdout chunk (state=3): >>>import '_abc' # <<< 
44109 1727204224.19292: stdout chunk (state=3): >>>import 'abc' # import 'io' # <<< 44109 1727204224.19314: stdout chunk (state=3): >>>import '_stat' # import 'stat' # <<< 44109 1727204224.19536: stdout chunk (state=3): >>>import '_collections_abc' # import 'genericpath' # import 'posixpath' # import 'os' # import '_sitebuiltins' # Processing user site-packages Processing global site-packages Adding directory: '/usr/local/lib/python3.12/site-packages' Adding directory: '/usr/lib64/python3.12/site-packages' Adding directory: '/usr/lib/python3.12/site-packages' Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' <<< 44109 1727204224.19564: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py <<< 44109 1727204224.19567: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' <<< 44109 1727204224.19656: stdout chunk (state=3): >>>import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8b838e5130> <<< 44109 1727204224.19660: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py <<< 44109 1727204224.19933: stdout chunk (state=3): >>># code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8b838e6060> import 'site' # Python 3.12.5 (main, Aug 23 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. 
<<< 44109 1727204224.20382: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py # code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py # code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py # code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py # code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8b83923e90> # /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py # code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8b83923f50> <<< 44109 1727204224.20397: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py <<< 44109 1727204224.20421: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' <<< 44109 1727204224.20494: stdout chunk (state=3): >>># /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py <<< 44109 1727204224.20503: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' <<< 44109 1727204224.20546: stdout chunk (state=3): >>>import 'itertools' # <<< 44109 1727204224.20550: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches 
/usr/lib64/python3.12/keyword.py <<< 44109 1727204224.20556: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8b8395b830> <<< 44109 1727204224.20661: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py <<< 44109 1727204224.20664: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8b8395bec0> <<< 44109 1727204224.20666: stdout chunk (state=3): >>>import '_collections' # <<< 44109 1727204224.20668: stdout chunk (state=3): >>>import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8b8393bb60> <<< 44109 1727204224.20670: stdout chunk (state=3): >>>import '_functools' # <<< 44109 1727204224.20700: stdout chunk (state=3): >>>import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8b83939280> <<< 44109 1727204224.20792: stdout chunk (state=3): >>>import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8b83921040> <<< 44109 1727204224.20860: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py # code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' import '_sre' # <<< 44109 1727204224.20872: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py <<< 44109 1727204224.20908: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' <<< 44109 1727204224.20980: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py # code object from 
'/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8b8397f800> <<< 44109 1727204224.20986: stdout chunk (state=3): >>>import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8b8397e420> <<< 44109 1727204224.21018: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py <<< 44109 1727204224.21021: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8b8393a150> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8b8397cc20> <<< 44109 1727204224.21155: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py # code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8b839b0890> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8b839202c0> # /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' <<< 44109 1727204224.21158: stdout chunk (state=3): >>># extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8b839b0d40> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8b839b0bf0> <<< 44109 1727204224.21205: stdout chunk (state=3): >>># extension module 'binascii' loaded from 
'/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' <<< 44109 1727204224.21208: stdout chunk (state=3): >>># extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' <<< 44109 1727204224.21587: stdout chunk (state=3): >>>import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8b839b0fe0> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8b8391ede0> # /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py # code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8b839b1670> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8b839b1370> import 'importlib.machinery' # # /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8b839b2540> import 'importlib.util' # import 'runpy' # # /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py # code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8b839c8740> import 'errno' # # extension module 'zlib' loaded from 
'/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8b839c9e20> # /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py <<< 44109 1727204224.21631: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' <<< 44109 1727204224.21634: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py <<< 44109 1727204224.21636: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' <<< 44109 1727204224.21638: stdout chunk (state=3): >>>import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8b839cacc0> <<< 44109 1727204224.21690: stdout chunk (state=3): >>># extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' <<< 44109 1727204224.21697: stdout chunk (state=3): >>># extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8b839cb2f0> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8b839ca210> <<< 44109 1727204224.21781: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py # code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' <<< 44109 1727204224.21785: stdout chunk (state=3): >>># extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7f8b839cbd70> <<< 44109 1727204224.21867: stdout chunk (state=3): >>>import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8b839cb4a0> <<< 44109 1727204224.21874: stdout chunk (state=3): >>>import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8b839b24b0> # /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py <<< 44109 1727204224.21880: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' <<< 44109 1727204224.21898: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py <<< 44109 1727204224.21979: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' # extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8b836c7c50> <<< 44109 1727204224.21986: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' <<< 44109 1727204224.22019: stdout chunk (state=3): >>># extension module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' <<< 44109 1727204224.22026: stdout chunk (state=3): >>># extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8b836f0770> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8b836f04d0> <<< 44109 1727204224.22067: stdout chunk 
(state=3): >>># extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8b836f06b0> <<< 44109 1727204224.22403: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py # code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' # extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' # extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8b836f1070> <<< 44109 1727204224.22617: stdout chunk (state=3): >>># extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8b836f19a0> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8b836f0920> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8b836c5df0> # /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py # code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py # code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8b836f2d80> import 'weakref' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f8b836f1ac0> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8b839b2c60> # /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py <<< 44109 1727204224.22740: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' <<< 44109 1727204224.22744: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py <<< 44109 1727204224.22747: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' <<< 44109 1727204224.22839: stdout chunk (state=3): >>>import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8b8371f110> <<< 44109 1727204224.22845: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' <<< 44109 1727204224.22856: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py <<< 44109 1727204224.22879: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' <<< 44109 1727204224.22957: stdout chunk (state=3): >>>import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8b837434d0> # /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py <<< 44109 1727204224.22994: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' <<< 44109 1727204224.23062: stdout chunk (state=3): >>>import 'ntpath' # <<< 44109 1727204224.23295: stdout chunk (state=3): >>># 
/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8b837a0230> # /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py # code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8b837a2990> <<< 44109 1727204224.23365: stdout chunk (state=3): >>>import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8b837a0350> <<< 44109 1727204224.23430: stdout chunk (state=3): >>>import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8b83769220> # /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py <<< 44109 1727204224.23449: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8b835a1340> import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8b837422d0> <<< 44109 1727204224.23526: stdout chunk (state=3): >>>import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8b836f3ce0> <<< 44109 1727204224.23740: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/cp437.pyc' import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7f8b837428d0> <<< 44109 1727204224.23926: stdout chunk (state=3): >>># zipimport: found 103 names in 
'/tmp/ansible_ansible.legacy.setup_payload_3m_0209k/ansible_ansible.legacy.setup_payload.zip' <<< 44109 1727204224.23930: stdout chunk (state=3): >>># zipimport: zlib available <<< 44109 1727204224.24304: stdout chunk (state=3): >>># zipimport: zlib available # /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py # code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py # code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8b836070b0> <<< 44109 1727204224.24506: stdout chunk (state=3): >>>import '_typing' # import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8b835e5fa0> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8b835e5130> # zipimport: zlib available import 'ansible' # <<< 44109 1727204224.24530: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 44109 1727204224.24557: stdout chunk (state=3): >>># zipimport: zlib available <<< 44109 1727204224.24851: stdout chunk (state=3): >>>import 'ansible.module_utils' # # zipimport: zlib available <<< 44109 1727204224.26235: stdout chunk (state=3): >>># zipimport: zlib available <<< 44109 1727204224.27395: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8b836053a0> <<< 44109 1727204224.27516: stdout chunk (state=3): 
>>># /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' <<< 44109 1727204224.27547: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' # extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8b8363aa20> <<< 44109 1727204224.27550: stdout chunk (state=3): >>>import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8b8363a7b0> <<< 44109 1727204224.27584: stdout chunk (state=3): >>>import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8b8363a0c0> <<< 44109 1727204224.27603: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py <<< 44109 1727204224.27658: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8b8363a600> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8b83b129c0> <<< 44109 1727204224.27668: stdout chunk (state=3): >>>import 'atexit' # <<< 44109 1727204224.27740: stdout chunk (state=3): >>># extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # 
extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8b8363b740> # extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8b8363b950> <<< 44109 1727204224.27758: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py <<< 44109 1727204224.27816: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' <<< 44109 1727204224.27828: stdout chunk (state=3): >>>import '_locale' # <<< 44109 1727204224.27880: stdout chunk (state=3): >>>import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8b8363be60> import 'pwd' # <<< 44109 1727204224.27908: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py <<< 44109 1727204224.27986: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' <<< 44109 1727204224.28013: stdout chunk (state=3): >>>import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8b82f25c40> <<< 44109 1727204224.28017: stdout chunk (state=3): >>># extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' <<< 44109 1727204224.28019: stdout chunk (state=3): >>># extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8b82f27860> <<< 44109 1727204224.28064: stdout chunk (state=3): >>># 
/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py # code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' <<< 44109 1727204224.28095: stdout chunk (state=3): >>>import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8b82f2c260> <<< 44109 1727204224.28112: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py <<< 44109 1727204224.28146: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' <<< 44109 1727204224.28192: stdout chunk (state=3): >>>import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8b82f2d400> <<< 44109 1727204224.28195: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py <<< 44109 1727204224.28284: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' <<< 44109 1727204224.28312: stdout chunk (state=3): >>>import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8b82f2fef0> <<< 44109 1727204224.28353: stdout chunk (state=3): >>># extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8b839cab40> <<< 44109 1727204224.28377: stdout chunk (state=3): >>>import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8b82f2e1b0> <<< 44109 1727204224.28393: stdout chunk 
(state=3): >>># /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py <<< 44109 1727204224.28452: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' <<< 44109 1727204224.28456: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py <<< 44109 1727204224.28459: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' <<< 44109 1727204224.28504: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py <<< 44109 1727204224.28897: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8b82f33e90> import '_tokenize' # import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8b82f32960> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8b82f326c0> <<< 44109 1727204224.28901: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py <<< 44109 1727204224.28905: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8b82f32c30> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8b82f2e6c0> # extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from 
'/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8b82f78140> <<< 44109 1727204224.28933: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8b82f782c0> <<< 44109 1727204224.28949: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py <<< 44109 1727204224.28996: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' <<< 44109 1727204224.28999: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py <<< 44109 1727204224.29001: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' <<< 44109 1727204224.29040: stdout chunk (state=3): >>># extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8b82f79d30> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8b82f79af0> <<< 44109 1727204224.29154: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py # code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' <<< 44109 1727204224.29259: stdout chunk (state=3): >>># extension module 
'_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8b82f7c2f0> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8b82f7a420> # /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py # code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py # code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' <<< 44109 1727204224.29273: stdout chunk (state=3): >>>import '_string' # <<< 44109 1727204224.29304: stdout chunk (state=3): >>>import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8b82f7fad0> <<< 44109 1727204224.29606: stdout chunk (state=3): >>>import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8b82f7c4a0> # extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8b82f80b00> # extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8b82f7d010> # extension module 'systemd.id128' loaded from 
'/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8b82f80bf0> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8b82f78470> <<< 44109 1727204224.29626: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' <<< 44109 1727204224.29710: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py # code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' <<< 44109 1727204224.29730: stdout chunk (state=3): >>># extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' <<< 44109 1727204224.29743: stdout chunk (state=3): >>># extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8b82e0c350> <<< 44109 1727204224.30009: stdout chunk (state=3): >>># extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8b82e0d4f0> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8b82f82ae0> # extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 
'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8b82f83e90> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8b82f82750> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat' # <<< 44109 1727204224.30036: stdout chunk (state=3): >>># zipimport: zlib available <<< 44109 1727204224.30144: stdout chunk (state=3): >>># zipimport: zlib available <<< 44109 1727204224.30259: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text' # <<< 44109 1727204224.30267: stdout chunk (state=3): >>># zipimport: zlib available <<< 44109 1727204224.30390: stdout chunk (state=3): >>># zipimport: zlib available <<< 44109 1727204224.30509: stdout chunk (state=3): >>># zipimport: zlib available <<< 44109 1727204224.31199: stdout chunk (state=3): >>># zipimport: zlib available <<< 44109 1727204224.31758: stdout chunk (state=3): >>>import 'ansible.module_utils.six' # <<< 44109 1727204224.31765: stdout chunk (state=3): >>>import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # <<< 44109 1727204224.31789: stdout chunk (state=3): >>># /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' # extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7f8b82e11580> <<< 44109 1727204224.31874: stdout chunk (state=3): >>># /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' <<< 44109 1727204224.31919: stdout chunk (state=3): >>>import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8b82e12300> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8b83607020> <<< 44109 1727204224.32022: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.selinux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils._text' # <<< 44109 1727204224.32025: stdout chunk (state=3): >>># zipimport: zlib available <<< 44109 1727204224.32224: stdout chunk (state=3): >>># zipimport: zlib available <<< 44109 1727204224.32498: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py # code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8b82e12150> # zipimport: zlib available <<< 44109 1727204224.32846: stdout chunk (state=3): >>># zipimport: zlib available <<< 44109 1727204224.33414: stdout chunk (state=3): >>># zipimport: zlib available <<< 44109 1727204224.33428: stdout chunk (state=3): >>># zipimport: zlib available <<< 44109 1727204224.33479: stdout chunk (state=3): >>>import 'ansible.module_utils.common.collections' # <<< 44109 1727204224.33489: stdout chunk (state=3): >>># zipimport: zlib available <<< 44109 1727204224.33533: stdout chunk (state=3): >>># zipimport: zlib available <<< 44109 1727204224.33645: stdout chunk (state=3): >>>import 'ansible.module_utils.common.warnings' # # zipimport: zlib available <<< 44109 1727204224.33690: stdout chunk (state=3): >>># zipimport: zlib 
available <<< 44109 1727204224.33727: stdout chunk (state=3): >>>import 'ansible.module_utils.errors' # <<< 44109 1727204224.33765: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing' # <<< 44109 1727204224.33866: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing.convert_bool' # # zipimport: zlib available <<< 44109 1727204224.34116: stdout chunk (state=3): >>># zipimport: zlib available <<< 44109 1727204224.34523: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py # code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' import '_ast' # import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8b82e134d0> # zipimport: zlib available <<< 44109 1727204224.34614: stdout chunk (state=3): >>># zipimport: zlib available <<< 44109 1727204224.34735: stdout chunk (state=3): >>>import 'ansible.module_utils.common.text.formatters' # import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # import 'ansible.module_utils.common.arg_spec' # # zipimport: zlib available <<< 44109 1727204224.34769: stdout chunk (state=3): >>># zipimport: zlib available <<< 44109 1727204224.34862: stdout chunk (state=3): >>>import 'ansible.module_utils.common.locale' # # zipimport: zlib available <<< 44109 1727204224.34865: stdout chunk (state=3): >>># zipimport: zlib available <<< 44109 1727204224.34962: stdout chunk (state=3): >>># zipimport: zlib available <<< 44109 1727204224.34966: stdout chunk (state=3): >>># zipimport: zlib available <<< 44109 1727204224.35044: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py <<< 44109 1727204224.35175: stdout chunk (state=3): >>># code object from 
'/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' <<< 44109 1727204224.35403: stdout chunk (state=3): >>># extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8b82e1dee0> import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8b82e18e60> import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available <<< 44109 1727204224.35526: stdout chunk (state=3): >>># zipimport: zlib available # /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' # /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' <<< 44109 1727204224.35600: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py <<< 44109 1727204224.35632: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' <<< 44109 1727204224.35635: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py <<< 44109 1727204224.35655: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' <<< 44109 1727204224.35708: stdout chunk (state=3): >>>import 'gettext' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f8b82f06a20> <<< 44109 1727204224.35879: stdout chunk (state=3): >>>import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8b82ffe6f0> import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8b82e1e1b0> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8b82e1e000> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # # zipimport: zlib available # zipimport: zlib available <<< 44109 1727204224.36015: stdout chunk (state=3): >>>import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # import 'ansible.module_utils.basic' # # zipimport: zlib available <<< 44109 1727204224.36019: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.modules' # <<< 44109 1727204224.36021: stdout chunk (state=3): >>># zipimport: zlib available <<< 44109 1727204224.36093: stdout chunk (state=3): >>># zipimport: zlib available <<< 44109 1727204224.36205: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available # zipimport: zlib available <<< 44109 1727204224.36396: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.namespace' # # zipimport: zlib available <<< 44109 1727204224.36434: stdout chunk (state=3): >>># zipimport: zlib available <<< 44109 1727204224.36521: stdout chunk (state=3): >>># zipimport: zlib available <<< 44109 1727204224.36553: stdout chunk (state=3): >>># zipimport: zlib available <<< 44109 1727204224.36581: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.typing' # # zipimport: zlib available <<< 44109 1727204224.39478: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc matches 
/usr/lib64/python3.12/multiprocessing/__init__.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/context.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/process.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc' import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8b82eb2300> # /usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/reduction.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc matches /usr/lib64/python3.12/pickle.py # code object from '/usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc matches /usr/lib64/python3.12/_compat_pickle.py # code object from '/usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc' import '_compat_pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8b82af0140> # extension module '_pickle' loaded from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' # extension module '_pickle' executed from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' import '_pickle' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8b82af04a0> import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8b82e9b1a0> import 'multiprocessing.reduction' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8b82eb2e40> import 'multiprocessing.context' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f8b82eb0aa0> import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8b82eb1340> # /usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/pool.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc matches /usr/lib64/python3.12/queue.py # code object from '/usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc matches /usr/lib64/python3.12/heapq.py # code object from '/usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc' # extension module '_heapq' loaded from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' # extension module '_heapq' executed from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8b82af3440> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8b82af2cf0> # extension module '_queue' loaded from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' # extension module '_queue' executed from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8b82af2ed0> import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8b82af2120> # /usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/util.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc' import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8b82af3500> # /usr/lib64/python3.12/m<<< 44109 1727204224.39520: stdout chunk (state=3): 
>>>ultiprocessing/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/connection.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc' # extension module '_multiprocessing' loaded from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' # extension module '_multiprocessing' executed from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8b82b55fd0> import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8b82af3fe0> import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8b82eb06b0> import 'ansible.module_utils.facts.timeout' # import 'ansible.module_utils.facts.collector' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other.facter' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other.ohai' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.apparmor' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.caps' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.chroot' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.utils' # import 'ansible.module_utils.facts.system.cmdline' # # zipimport: zlib available <<< 44109 1727204224.39962: stdout chunk (state=3): >>># zipimport: zlib available <<< 44109 1727204224.40651: stdout chunk 
(state=3): >>>import 'ansible.module_utils.facts.system.distribution' # # zipimport: zlib available # zipimport: zlib available <<< 44109 1727204224.40710: stdout chunk (state=3): >>># zipimport: zlib available <<< 44109 1727204224.40793: stdout chunk (state=3): >>># zipimport: zlib available <<< 44109 1727204224.40821: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.datetime' # import 'ansible.module_utils.facts.system.date_time' # <<< 44109 1727204224.40824: stdout chunk (state=3): >>># zipimport: zlib available <<< 44109 1727204224.40870: stdout chunk (state=3): >>># zipimport: zlib available <<< 44109 1727204224.40920: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.env' # <<< 44109 1727204224.40936: stdout chunk (state=3): >>># zipimport: zlib available <<< 44109 1727204224.40993: stdout chunk (state=3): >>># zipimport: zlib available <<< 44109 1727204224.41078: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.dns' # <<< 44109 1727204224.41105: stdout chunk (state=3): >>># zipimport: zlib available <<< 44109 1727204224.41170: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.system.fips' # <<< 44109 1727204224.41238: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 44109 1727204224.41270: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.loadavg' # <<< 44109 1727204224.41286: stdout chunk (state=3): >>># zipimport: zlib available <<< 44109 1727204224.41432: stdout chunk (state=3): >>># zipimport: zlib available <<< 44109 1727204224.41596: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/glob.py<<< 44109 1727204224.41599: stdout chunk (state=3): >>> # code object from '/usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc' <<< 44109 1727204224.41661: stdout chunk (state=3): >>>import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 
0x7f8b82b578c0> # /usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc matches /usr/lib64/python3.12/configparser.py <<< 44109 1727204224.41724: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc' <<< 44109 1727204224.42021: stdout chunk (state=3): >>>import 'configparser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8b82b56c90> import 'ansible.module_utils.facts.system.local' # # zipimport: zlib available <<< 44109 1727204224.42185: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.system.lsb' # <<< 44109 1727204224.42214: stdout chunk (state=3): >>># zipimport: zlib available<<< 44109 1727204224.42234: stdout chunk (state=3): >>> <<< 44109 1727204224.42384: stdout chunk (state=3): >>># zipimport: zlib available <<< 44109 1727204224.42511: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.pkg_mgr' # <<< 44109 1727204224.42518: stdout chunk (state=3): >>> <<< 44109 1727204224.42541: stdout chunk (state=3): >>># zipimport: zlib available<<< 44109 1727204224.42546: stdout chunk (state=3): >>> <<< 44109 1727204224.42673: stdout chunk (state=3): >>># zipimport: zlib available<<< 44109 1727204224.42682: stdout chunk (state=3): >>> <<< 44109 1727204224.42800: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.platform' # <<< 44109 1727204224.42897: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 44109 1727204224.42975: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc matches /usr/lib64/python3.12/ssl.py <<< 44109 1727204224.43079: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc' <<< 44109 1727204224.43187: stdout chunk (state=3): >>># extension module '_ssl' loaded from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' <<< 44109 1727204224.43297: stdout chunk 
(state=3): >>># extension module '_ssl' executed from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8b82b8e2a0> <<< 44109 1727204224.43689: stdout chunk (state=3): >>>import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8b82b7e0f0> import 'ansible.module_utils.facts.system.python' # # zipimport: zlib available <<< 44109 1727204224.43808: stdout chunk (state=3): >>># zipimport: zlib available <<< 44109 1727204224.43840: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.selinux' # <<< 44109 1727204224.43845: stdout chunk (state=3): >>> <<< 44109 1727204224.43869: stdout chunk (state=3): >>># zipimport: zlib available <<< 44109 1727204224.44014: stdout chunk (state=3): >>># zipimport: zlib available<<< 44109 1727204224.44028: stdout chunk (state=3): >>> <<< 44109 1727204224.44197: stdout chunk (state=3): >>># zipimport: zlib available <<< 44109 1727204224.44391: stdout chunk (state=3): >>># zipimport: zlib available <<< 44109 1727204224.44568: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.version' # <<< 44109 1727204224.44599: stdout chunk (state=3): >>> import 'ansible.module_utils.facts.system.service_mgr' # <<< 44109 1727204224.44626: stdout chunk (state=3): >>> # zipimport: zlib available <<< 44109 1727204224.44760: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.system.ssh_pub_keys' # <<< 44109 1727204224.44773: stdout chunk (state=3): >>># zipimport: zlib available <<< 44109 1727204224.44849: stdout chunk (state=3): >>># zipimport: zlib available <<< 44109 1727204224.44905: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc matches /usr/lib64/python3.12/getpass.py<<< 44109 1727204224.44927: stdout chunk (state=3): >>> # code object from '/usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc' <<< 44109 1727204224.45025: 
stdout chunk (state=3): >>># extension module 'termios' loaded from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' # extension module 'termios' executed from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so'<<< 44109 1727204224.45029: stdout chunk (state=3): >>> <<< 44109 1727204224.45042: stdout chunk (state=3): >>>import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8b82ba5b80> import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8b82ba5be0> import 'ansible.module_utils.facts.system.user' # <<< 44109 1727204224.45079: stdout chunk (state=3): >>> # zipimport: zlib available <<< 44109 1727204224.45100: stdout chunk (state=3): >>># zipimport: zlib available<<< 44109 1727204224.45133: stdout chunk (state=3): >>> import 'ansible.module_utils.facts.hardware' # # zipimport: zlib available <<< 44109 1727204224.45326: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.hardware.base' # # zipimport: zlib available <<< 44109 1727204224.45552: stdout chunk (state=3): >>># zipimport: zlib available <<< 44109 1727204224.45891: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.aix' # <<< 44109 1727204224.45895: stdout chunk (state=3): >>> # zipimport: zlib available <<< 44109 1727204224.46095: stdout chunk (state=3): >>># zipimport: zlib available <<< 44109 1727204224.46124: stdout chunk (state=3): >>># zipimport: zlib available <<< 44109 1727204224.46196: stdout chunk (state=3): >>># zipimport: zlib available<<< 44109 1727204224.46222: stdout chunk (state=3): >>> <<< 44109 1727204224.46258: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.sysctl' # <<< 44109 1727204224.46289: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.darwin' # <<< 44109 1727204224.46312: stdout chunk (state=3): >>> <<< 44109 1727204224.46320: stdout chunk (state=3): >>># zipimport: zlib available <<< 
44109 1727204224.46359: stdout chunk (state=3): >>># zipimport: zlib available <<< 44109 1727204224.46413: stdout chunk (state=3): >>># zipimport: zlib available <<< 44109 1727204224.46642: stdout chunk (state=3): >>># zipimport: zlib available <<< 44109 1727204224.46870: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.freebsd' # import 'ansible.module_utils.facts.hardware.dragonfly' # <<< 44109 1727204224.46898: stdout chunk (state=3): >>># zipimport: zlib available <<< 44109 1727204224.47111: stdout chunk (state=3): >>># zipimport: zlib available <<< 44109 1727204224.47238: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.hpux' # <<< 44109 1727204224.47249: stdout chunk (state=3): >>># zipimport: zlib available <<< 44109 1727204224.47284: stdout chunk (state=3): >>># zipimport: zlib available <<< 44109 1727204224.47325: stdout chunk (state=3): >>># zipimport: zlib available <<< 44109 1727204224.47961: stdout chunk (state=3): >>># zipimport: zlib available <<< 44109 1727204224.48910: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.linux' # import 'ansible.module_utils.facts.hardware.hurd' # # zipimport: zlib available # zipimport: zlib available <<< 44109 1727204224.49029: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.netbsd' # <<< 44109 1727204224.49062: stdout chunk (state=3): >>># zipimport: zlib available <<< 44109 1727204224.49257: stdout chunk (state=3): >>># zipimport: zlib available<<< 44109 1727204224.49399: stdout chunk (state=3): >>> import 'ansible.module_utils.facts.hardware.openbsd' # <<< 44109 1727204224.49431: stdout chunk (state=3): >>># zipimport: zlib available<<< 44109 1727204224.49434: stdout chunk (state=3): >>> <<< 44109 1727204224.49694: stdout chunk (state=3): >>># zipimport: zlib available<<< 44109 1727204224.49700: stdout chunk (state=3): >>> <<< 44109 1727204224.49952: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.sunos' 
# <<< 44109 1727204224.49967: stdout chunk (state=3): >>> <<< 44109 1727204224.49991: stdout chunk (state=3): >>># zipimport: zlib available<<< 44109 1727204224.50005: stdout chunk (state=3): >>> <<< 44109 1727204224.50023: stdout chunk (state=3): >>># zipimport: zlib available<<< 44109 1727204224.50043: stdout chunk (state=3): >>> import 'ansible.module_utils.facts.network' # <<< 44109 1727204224.50058: stdout chunk (state=3): >>> <<< 44109 1727204224.50086: stdout chunk (state=3): >>># zipimport: zlib available<<< 44109 1727204224.50100: stdout chunk (state=3): >>> <<< 44109 1727204224.50163: stdout chunk (state=3): >>># zipimport: zlib available<<< 44109 1727204224.50178: stdout chunk (state=3): >>> <<< 44109 1727204224.50251: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.base' # # zipimport: zlib available<<< 44109 1727204224.50424: stdout chunk (state=3): >>> # zipimport: zlib available <<< 44109 1727204224.50594: stdout chunk (state=3): >>># zipimport: zlib available <<< 44109 1727204224.50890: stdout chunk (state=3): >>># zipimport: zlib available <<< 44109 1727204224.51252: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.generic_bsd' # import 'ansible.module_utils.facts.network.aix' # <<< 44109 1727204224.51352: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 44109 1727204224.51415: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.darwin' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.dragonfly' # # zipimport: zlib available <<< 44109 1727204224.51418: stdout chunk (state=3): >>># zipimport: zlib available<<< 44109 1727204224.51444: stdout chunk (state=3): >>> <<< 44109 1727204224.51538: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.fc_wwn' # <<< 44109 1727204224.51542: stdout chunk (state=3): >>># zipimport: zlib available <<< 44109 1727204224.51553: stdout chunk (state=3): 
>>># zipimport: zlib available <<< 44109 1727204224.51579: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.freebsd' # <<< 44109 1727204224.51594: stdout chunk (state=3): >>># zipimport: zlib available <<< 44109 1727204224.51658: stdout chunk (state=3): >>># zipimport: zlib available <<< 44109 1727204224.51744: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.hpux' # <<< 44109 1727204224.51765: stdout chunk (state=3): >>># zipimport: zlib available <<< 44109 1727204224.51819: stdout chunk (state=3): >>># zipimport: zlib available <<< 44109 1727204224.51886: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.hurd' # <<< 44109 1727204224.51889: stdout chunk (state=3): >>># zipimport: zlib available <<< 44109 1727204224.52182: stdout chunk (state=3): >>># zipimport: zlib available <<< 44109 1727204224.52439: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.linux' # <<< 44109 1727204224.52466: stdout chunk (state=3): >>># zipimport: zlib available <<< 44109 1727204224.52761: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.network.iscsi' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.nvme' # # zipimport: zlib available # zipimport: zlib available <<< 44109 1727204224.52791: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.netbsd' # <<< 44109 1727204224.52846: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 44109 1727204224.52889: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.openbsd' # <<< 44109 1727204224.52903: stdout chunk (state=3): >>># zipimport: zlib available <<< 44109 1727204224.53025: stdout chunk (state=3): >>># zipimport: zlib available <<< 44109 1727204224.53141: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.sunos' # <<< 44109 1727204224.53148: stdout chunk (state=3): >>># 
zipimport: zlib available <<< 44109 1727204224.53178: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.virtual' # <<< 44109 1727204224.53197: stdout chunk (state=3): >>># zipimport: zlib available <<< 44109 1727204224.53308: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.virtual.base' # <<< 44109 1727204224.53319: stdout chunk (state=3): >>># zipimport: zlib available <<< 44109 1727204224.53350: stdout chunk (state=3): >>># zipimport: zlib available <<< 44109 1727204224.53370: stdout chunk (state=3): >>># zipimport: zlib available <<< 44109 1727204224.53443: stdout chunk (state=3): >>># zipimport: zlib available <<< 44109 1727204224.53509: stdout chunk (state=3): >>># zipimport: zlib available <<< 44109 1727204224.53625: stdout chunk (state=3): >>># zipimport: zlib available<<< 44109 1727204224.53630: stdout chunk (state=3): >>> <<< 44109 1727204224.53754: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.sysctl' # import 'ansible.module_utils.facts.virtual.freebsd' # <<< 44109 1727204224.53770: stdout chunk (state=3): >>> import 'ansible.module_utils.facts.virtual.dragonfly' # <<< 44109 1727204224.53795: stdout chunk (state=3): >>> # zipimport: zlib available <<< 44109 1727204224.53947: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.virtual.hpux' # <<< 44109 1727204224.53977: stdout chunk (state=3): >>># zipimport: zlib available <<< 44109 1727204224.54348: stdout chunk (state=3): >>># zipimport: zlib available <<< 44109 1727204224.54689: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.linux' # <<< 44109 1727204224.54694: stdout chunk (state=3): >>> <<< 44109 1727204224.54723: stdout chunk (state=3): >>># zipimport: zlib available <<< 44109 1727204224.54874: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.virtual.netbsd' # <<< 44109 1727204224.54904: stdout 
chunk (state=3): >>># zipimport: zlib available<<< 44109 1727204224.54909: stdout chunk (state=3): >>> <<< 44109 1727204224.54986: stdout chunk (state=3): >>># zipimport: zlib available <<< 44109 1727204224.55066: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.openbsd' # <<< 44109 1727204224.55096: stdout chunk (state=3): >>># zipimport: zlib available <<< 44109 1727204224.55235: stdout chunk (state=3): >>># zipimport: zlib available<<< 44109 1727204224.55369: stdout chunk (state=3): >>> import 'ansible.module_utils.facts.virtual.sunos' # <<< 44109 1727204224.55385: stdout chunk (state=3): >>> <<< 44109 1727204224.55403: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.default_collectors' # <<< 44109 1727204224.55431: stdout chunk (state=3): >>># zipimport: zlib available<<< 44109 1727204224.55436: stdout chunk (state=3): >>> <<< 44109 1727204224.55577: stdout chunk (state=3): >>># zipimport: zlib available <<< 44109 1727204224.55713: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.ansible_collector' # import 'ansible.module_utils.facts.compat' # import 'ansible.module_utils.facts' # <<< 44109 1727204224.55780: stdout chunk (state=3): >>># zipimport: zlib available <<< 44109 1727204224.56821: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc matches /usr/lib64/python3.12/encodings/idna.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc matches /usr/lib64/python3.12/stringprep.py # code object from '/usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc' # extension module 'unicodedata' loaded from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' # extension module 'unicodedata' executed from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' import 'unicodedata' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7f8b8293a780> import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8b82939310> import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8b82b7f200> <<< 44109 1727204224.81814: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/queues.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/queues.py <<< 44109 1727204224.81886: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/queues.cpython-312.pyc' <<< 44109 1727204224.81894: stdout chunk (state=3): >>>import 'multiprocessing.queues' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8b82983200> <<< 44109 1727204224.81909: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/synchronize.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/synchronize.py <<< 44109 1727204224.81930: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/synchronize.cpython-312.pyc' <<< 44109 1727204224.81965: stdout chunk (state=3): >>>import 'multiprocessing.synchronize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8b82981250> <<< 44109 1727204224.82052: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/dummy/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/dummy/__init__.py # code object from '/usr/lib64/python3.12/multiprocessing/dummy/__pycache__/__init__.cpython-312.pyc' <<< 44109 1727204224.82093: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/dummy/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/dummy/connection.py # code object from '/usr/lib64/python3.12/multiprocessing/dummy/__pycache__/connection.cpython-312.pyc' import 'multiprocessing.dummy.connection' # <_frozen_importlib_external.SourceFileLoader object at 
0x7f8b82983500> <<< 44109 1727204224.82144: stdout chunk (state=3): >>>import 'multiprocessing.dummy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8b82982180> <<< 44109 1727204224.82486: stdout chunk (state=3): >>>PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame <<< 44109 1727204225.09650: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_system": "Linux", "ansible_kernel": "6.11.0-25.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Mon Sep 16 20:35:26 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "managed-node1", "ansible_hostname": "managed-node1", "ansible_nodename": "managed-node1", "ansible_domain": "", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec277914f6c5b9c03bd977e30033112b", "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_iscsi_iqn": "", "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Tuesday", 
"weekday_number": "2", "weeknumber": "39", "day": "24", "hour": "14", "minute": "57", "second": "04", "epoch": "1727204224", "epoch_int": "1727204224", "date": "2024-09-24", "time": "14:57:04", "iso8601_micro": "2024-09-24T18:57:04.568276Z", "iso8601": "2024-09-24T18:57:04Z", "iso8601_basic": "20240924T145704568276", "iso8601_basic_short": "20240924T145704", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.45.138 50362 10.31.14.47 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.45.138 50362 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_local": {}, "ansible_fibre_channel_wwn": [], "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_fips": false, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, 
"ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2923, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 608, "free": 2923}, "nocache": {"free": 3282, "used": 249}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ans<<< 44109 1727204225.09685: stdout chunk (state=3): >>>ible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec277914-f6c5-b9c0-3bd9-77e30033112b", "ansible_product_uuid": "ec277914-f6c5-b9c0-3bd9-77e30033112b", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["973ca870-ed1b-4e56-a8b4-735608119a28"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "973ca870-ed1b-4e56-a8b4-735608119a28", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["973ca870-ed1b-4e56-a8b4-735608119a28"]}, "labels": {}, "masters": {}}, 
"ansible_uptime_seconds": 815, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261785198592, "block_size": 4096, "block_total": 65519099, "block_available": 63912402, "block_used": 1606697, "inode_total": 131070960, "inode_available": 131027257, "inode_used": 43703, "uuid": "973ca870-ed1b-4e56-a8b4-735608119a28"}], "ansible_apparmor": {"status": "disabled"}, "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCWk3MCbpUJBEaXgG200pw3DBo34ukitT+wfYcFTuNXyUUPaXQe7v940X0cf5U78BgS3AAiRxfHirMb+4r43rwxBe5tl4Vq2WM+kz3JnOtxK8ZXTmwS9PbltzX5fg5CVds9Bu6KIwABJMlgT9CTHVjFlTBc8wpoeJvx8wVgMwQlnF+PFN/lzD0gn0sF11bqe2QCvxmm9r7Lum/QdIVGgOiZMMbULJZb0Iih24Tn74Ho6n9zLSFZ5FiFifjm7M6k1hVtfcAQi3GfPdUxkN0Z66f3KaW4hptFlHxttjLliuAyfNF4UrXIab7y/nDix1ll4x4lLHsVRpcwtmVVe/Z+2/pcmRv4fjw8YzWY1pLV5u1BUUBthICgfv0pXEuLP9UD/krnjy0Ho9gAbtjdoRWtn7gvjRds+WEwk83rZS3UnAc3pl2DmHzbp4IfRC1zp8eJPJoVwcSTEr61su59tkNntjdKAcLWeas1p9XBaTIRx7aqRp7Vdet96xbSnDnBCriXgAU=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBJzECuW8BnvEbYnQxnxmebvg9TYk9r0OUd9aUg8FFv4MvjSzW8tCfnW556hw9n4PI2hShtAWz7XExrMZPlTQXRo=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAINUE+mKAHEgVl/vTdVMwRCu3lDCTOYBl1RcikvxylCeg", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_loadavg": {"1m": 0.47021484375, "5m": 0.509765625, "15m": 0.29931640625}, "ansible_interfaces": ["eth0", "lo"], "ansible_eth0": {"device": "eth0", "macaddress": "0a:ff:dd:89:9b:e5", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.14.47", "broadcast": 
"10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22"}, "ipv6": [{"address": "fe80::8ff:ddff:fe89:9be5", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", 
"esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", 
"tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.12.1", "interface": "eth0", "address": "10.31.14.47", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22", "macaddress": "0a:ff:dd:89:9b:e5", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.14.47"], "ansible_all_ipv6_addresses": ["fe80::8ff:ddff:fe89:9be5"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.14.47", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::8ff:ddff:fe89:9be5"]}, "ansible_pkg_mgr": "dnf", "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_is_chroot": false, "ansible_lsb": {}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-25.el10.x86_64", "root": "UUID=973ca870-ed1b-4e56-a8b4-735608119a28", "ro": true, "rhgb": true, "crashkernel": 
"1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-25.el10.x86_64", "root": "UUID=973ca870-ed1b-4e56-a8b4-735608119a28", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:11e86335-d786-4518-8abc-c9417b351256", "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 44109 1727204225.10552: stdout chunk (state=3): >>># clear sys.path_importer_cache <<< 44109 1727204225.10592: stdout chunk (state=3): >>># clear sys.path_hooks # clear builtins._ <<< 44109 1727204225.10633: stdout chunk (state=3): >>># clear sys.path # clear sys.argv <<< 44109 1727204225.10636: stdout chunk (state=3): >>># clear sys.ps1 # clear sys.ps2 # clear sys.last_exc<<< 44109 1727204225.10665: stdout chunk (state=3): >>> # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings<<< 44109 1727204225.10691: stdout chunk (state=3): >>> # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs<<< 44109 1727204225.10731: stdout chunk (state=3): >>> # 
cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath<<< 44109 1727204225.10741: stdout chunk (state=3): >>> # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools <<< 44109 1727204225.10804: stdout chunk (state=3): >>># cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser<<< 44109 1727204225.11083: stdout chunk (state=3): >>> # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno<<< 44109 1727204225.11097: stdout chunk (state=3): >>> # cleanup[2] removing zlib # 
cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress<<< 44109 1727204225.11127: stdout chunk (state=3): >>> # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] 
removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy 
ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters <<< 44109 1727204225.11268: stdout chunk (state=3): >>># destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] 
removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _heapq # cleanup[2] removing heapq # destroy heapq # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl 
# cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing 
ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy 
ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy 
ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual <<< 44109 1727204225.11316: stdout chunk (state=3): >>># destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd<<< 44109 1727204225.11337: stdout chunk (state=3): >>> # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd <<< 44109 1727204225.11360: stdout chunk (state=3): >>># destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # cleanup[2] removing multiprocessing.queues # cleanup[2] removing multiprocessing.synchronize # cleanup[2] removing multiprocessing.dummy.connection # cleanup[2] removing multiprocessing.dummy<<< 44109 1727204225.11504: stdout chunk (state=3): >>> <<< 44109 1727204225.12158: stdout chunk (state=3): >>># destroy _sitebuiltins <<< 44109 1727204225.12201: stdout chunk (state=3): >>># destroy importlib.machinery # destroy importlib._abc<<< 44109 1727204225.12227: stdout chunk (state=3): >>> # destroy importlib.util <<< 44109 1727204225.12283: stdout chunk (state=3): >>># destroy _bz2 # destroy _compression # destroy _lzma <<< 44109 1727204225.12394: stdout chunk (state=3): >>># destroy _blake2 <<< 44109 1727204225.12400: stdout chunk (state=3): >>># destroy binascii # destroy zlib # destroy bz2<<< 44109 1727204225.12467: stdout chunk (state=3): >>> # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy ipaddress<<< 44109 1727204225.12712: stdout chunk (state=3): >>> # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy 
systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy locale # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog<<< 44109 1727204225.12808: stdout chunk (state=3): >>> # destroy uuid # destroy selinux <<< 44109 1727204225.12814: stdout chunk (state=3): >>># destroy shutil <<< 44109 1727204225.12840: stdout chunk (state=3): >>># destroy distro # destroy distro.distro<<< 44109 1727204225.12857: stdout chunk (state=3): >>> # destroy argparse <<< 44109 1727204225.12952: stdout chunk (state=3): >>># destroy logging # destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector<<< 44109 1727204225.12986: stdout chunk (state=3): >>> # destroy multiprocessing # destroy multiprocessing.queues <<< 44109 1727204225.13030: stdout chunk (state=3): >>># destroy multiprocessing.synchronize # destroy multiprocessing.dummy # destroy multiprocessing.pool # destroy signal<<< 44109 1727204225.13078: stdout chunk (state=3): >>> # destroy pickle # destroy _compat_pickle # destroy _pickle <<< 44109 1727204225.13107: stdout chunk (state=3): >>># destroy queue # destroy _heapq <<< 44109 1727204225.13145: stdout chunk (state=3): >>># destroy _queue # destroy multiprocessing.reduction <<< 44109 1727204225.13188: stdout chunk (state=3): >>># destroy selectors # destroy shlex # destroy fcntl <<< 44109 1727204225.13250: stdout chunk (state=3): >>># destroy datetime<<< 44109 1727204225.13265: stdout chunk (state=3): >>> # destroy subprocess # destroy base64 <<< 44109 1727204225.13324: stdout chunk (state=3): >>># destroy _ssl # destroy ansible.module_utils.compat.selinux<<< 44109 1727204225.13367: stdout chunk (state=3): >>> # destroy getpass # destroy pwd # destroy termios<<< 44109 1727204225.13378: stdout chunk (state=3): >>> # destroy json <<< 44109 1727204225.13432: 
stdout chunk (state=3): >>># destroy socket # destroy struct # destroy glob<<< 44109 1727204225.13472: stdout chunk (state=3): >>> # destroy fnmatch # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # destroy unicodedata # destroy errno<<< 44109 1727204225.13504: stdout chunk (state=3): >>> # destroy multiprocessing.connection # destroy tempfile<<< 44109 1727204225.13507: stdout chunk (state=3): >>> # destroy multiprocessing.context # destroy multiprocessing.process # destroy multiprocessing.util # destroy _multiprocessing # destroy array<<< 44109 1727204225.13528: stdout chunk (state=3): >>> <<< 44109 1727204225.13601: stdout chunk (state=3): >>># destroy multiprocessing.dummy.connection # cleanup[3] wiping encodings.idna <<< 44109 1727204225.13660: stdout chunk (state=3): >>># destroy stringprep # cleanup[3] wiping configparser # cleanup[3] wiping selinux._selinux <<< 44109 1727204225.13664: stdout chunk (state=3): >>># cleanup[3] wiping ctypes._endian<<< 44109 1727204225.13698: stdout chunk (state=3): >>> # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser<<< 44109 1727204225.13728: stdout chunk (state=3): >>> # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid<<< 44109 1727204225.13747: stdout chunk (state=3): >>> # cleanup[3] wiping _datetime # cleanup[3] wiping traceback <<< 44109 1727204225.13781: stdout chunk (state=3): >>># destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing <<< 44109 1727204225.13916: stdout chunk (state=3): >>># cleanup[3] wiping 
collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib<<< 44109 1727204225.13920: stdout chunk (state=3): >>> # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings<<< 44109 1727204225.13923: stdout chunk (state=3): >>> # cleanup[3] wiping importlib._bootstrap_external<<< 44109 1727204225.13925: stdout chunk (state=3): >>> # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re<<< 44109 1727204225.13927: stdout chunk (state=3): >>> # destroy re._constants # destroy re._casefix <<< 44109 1727204225.13929: stdout chunk (state=3): >>># destroy re._compiler # destroy enum<<< 44109 1727204225.13984: stdout chunk (state=3): >>> # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc<<< 44109 1727204225.13988: stdout chunk (state=3): >>> # destroy collections.abc # cleanup[3] wiping _collections<<< 44109 1727204225.14030: stdout chunk (state=3): >>> # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath<<< 44109 1727204225.14059: stdout chunk (state=3): >>> # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat<<< 44109 1727204225.14089: stdout chunk (state=3): >>> # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs<<< 44109 1727204225.14103: stdout chunk (state=3): >>> # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # 
cleanup[3] wiping _io # cleanup[3] wiping _weakref<<< 44109 1727204225.14132: stdout chunk (state=3): >>> # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys <<< 44109 1727204225.14167: stdout chunk (state=3): >>># cleanup[3] wiping builtins # destroy selinux._selinux<<< 44109 1727204225.14187: stdout chunk (state=3): >>> <<< 44109 1727204225.14203: stdout chunk (state=3): >>># destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime <<< 44109 1727204225.14509: stdout chunk (state=3): >>># destroy sys.monitoring<<< 44109 1727204225.14539: stdout chunk (state=3): >>> <<< 44109 1727204225.14558: stdout chunk (state=3): >>># destroy _socket # destroy _collections <<< 44109 1727204225.14591: stdout chunk (state=3): >>># destroy platform <<< 44109 1727204225.14618: stdout chunk (state=3): >>># destroy _uuid # destroy stat # destroy genericpath # destroy re._parser <<< 44109 1727204225.14685: stdout chunk (state=3): >>># destroy tokenize # destroy ansible.module_utils.six.moves.urllib <<< 44109 1727204225.14695: stdout chunk (state=3): >>># destroy copyreg <<< 44109 1727204225.14742: stdout chunk (state=3): >>># destroy contextlib # destroy _typing <<< 44109 1727204225.14792: stdout chunk (state=3): >>># destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error<<< 44109 1727204225.14835: stdout chunk (state=3): >>> # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools<<< 44109 1727204225.14864: stdout chunk (state=3): >>> # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal <<< 44109 1727204225.15004: stdout chunk (state=3): 
>>># clear sys.meta_path # clear sys.modules # destroy _frozen_importlib <<< 44109 1727204225.15119: stdout chunk (state=3): >>># destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437<<< 44109 1727204225.15384: stdout chunk (state=3): >>> # destroy encodings.idna # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings<<< 44109 1727204225.15423: stdout chunk (state=3): >>> # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _hashlib # destroy _operator # destroy _sre # destroy _string # destroy re # destroy itertools # destroy _abc # destroy posix # destroy _functools # destroy builtins # destroy _thread<<< 44109 1727204225.15445: stdout chunk (state=3): >>> # clear sys.audit hooks <<< 44109 1727204225.15990: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44109 1727204225.16004: stderr chunk (state=3): >>>Shared connection to 10.31.14.47 closed. 
<<< 44109 1727204225.16139: stderr chunk (state=3): >>><<< 44109 1727204225.16170: stdout chunk (state=3): >>><<< 44109 1727204225.16470: _low_level_execute_command() done: rc=0, stdout=import _frozen_importlib # frozen import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # import '_io' # import 'marshal' # import 'posix' # import '_frozen_importlib_external' # # installing zipimport hook import 'time' # import 'zipimport' # # installed zipimport hook # /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' import '_codecs' # import 'codecs' # # /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8b83b104d0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8b83adfb30> # /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8b83b12a50> import '_signal' # import '_abc' # import 'abc' # import 'io' # import '_stat' # import 'stat' # import '_collections_abc' # import 'genericpath' # import 'posixpath' # import 'os' # import '_sitebuiltins' # Processing user site-packages Processing global site-packages Adding directory: '/usr/local/lib/python3.12/site-packages' Adding directory: '/usr/lib64/python3.12/site-packages' Adding directory: '/usr/lib/python3.12/site-packages' Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' # 
/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8b838e5130> # /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py # code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8b838e6060> import 'site' # Python 3.12.5 (main, Aug 23 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. # /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py # code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py # code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py # code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py # code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8b83923e90> # /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py # code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8b83923f50> # 
/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py # code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py # code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' import 'itertools' # # /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8b8395b830> # /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py # code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8b8395bec0> import '_collections' # import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8b8393bb60> import '_functools' # import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8b83939280> import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8b83921040> # /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py # code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' import '_sre' # # /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py # code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py # code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8b8397f800> import 're._parser' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f8b8397e420> # /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8b8393a150> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8b8397cc20> # /usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py # code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8b839b0890> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8b839202c0> # /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8b839b0d40> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8b839b0bf0> # extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8b839b0fe0> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8b8391ede0> # /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from 
'/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py # code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8b839b1670> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8b839b1370> import 'importlib.machinery' # # /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8b839b2540> import 'importlib.util' # import 'runpy' # # /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py # code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8b839c8740> import 'errno' # # extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8b839c9e20> # /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py # code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' import '_compression' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f8b839cacc0> # extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8b839cb2f0> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8b839ca210> # /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py # code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' # extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8b839cbd70> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8b839cb4a0> import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8b839b24b0> # /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py # code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py # code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' # extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8b836c7c50> # /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' # extension 
module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8b836f0770> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8b836f04d0> # extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8b836f06b0> # /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py # code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' # extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' # extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8b836f1070> # extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8b836f19a0> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8b836f0920> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8b836c5df0> # /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py # code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' # 
/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py # code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8b836f2d80> import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8b836f1ac0> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8b839b2c60> # /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py # code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py # code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8b8371f110> # /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py # code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8b837434d0> # /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py # code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' import 'ntpath' # # /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8b837a0230> # 
/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py # code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8b837a2990> import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8b837a0350> import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8b83769220> # /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8b835a1340> import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8b837422d0> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8b836f3ce0> # code object from '/usr/lib64/python3.12/encodings/cp437.pyc' import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7f8b837428d0> # zipimport: found 103 names in '/tmp/ansible_ansible.legacy.setup_payload_3m_0209k/ansible_ansible.legacy.setup_payload.zip' # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py # code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py # code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from 
'/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8b836070b0> import '_typing' # import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8b835e5fa0> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8b835e5130> # zipimport: zlib available import 'ansible' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8b836053a0> # /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' # extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8b8363aa20> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8b8363a7b0> import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8b8363a0c0> # 
/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8b8363a600> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8b83b129c0> import 'atexit' # # extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8b8363b740> # extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8b8363b950> # /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py # code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' import '_locale' # import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8b8363be60> import 'pwd' # # /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py # code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8b82f25c40> # extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8b82f27860> # /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches 
/usr/lib64/python3.12/selectors.py # code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8b82f2c260> # /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py # code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8b82f2d400> # /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py # code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8b82f2fef0> # extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8b839cab40> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8b82f2e1b0> # /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py # code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py # code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' # 
/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8b82f33e90> import '_tokenize' # import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8b82f32960> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8b82f326c0> # /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py # code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8b82f32c30> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8b82f2e6c0> # extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8b82f78140> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8b82f782c0> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' # extension module '_datetime' loaded from 
'/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8b82f79d30> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8b82f79af0> # /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py # code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' # extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8b82f7c2f0> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8b82f7a420> # /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py # code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py # code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' import '_string' # import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8b82f7fad0> import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8b82f7c4a0> # extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8b82f80b00> # extension module 'systemd._reader' loaded from 
'/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8b82f7d010> # extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8b82f80bf0> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8b82f78470> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py # code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' # extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8b82e0c350> # extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8b82e0d4f0> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8b82f82ae0> # extension module 'systemd._daemon' loaded from 
'/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8b82f83e90> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8b82f82750> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.six' # import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # # /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' # extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8b82e11580> # /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8b82e12300> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8b83607020> import 
'ansible.module_utils.compat.selinux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils._text' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py # code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8b82e12150> # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.collections' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.warnings' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.errors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing.convert_bool' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py # code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' import '_ast' # import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8b82e134d0> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text.formatters' # import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # import 'ansible.module_utils.common.arg_spec' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.locale' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py # code object from 
'/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' # extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8b82e1dee0> import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8b82e18e60> import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' # /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py # code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py # code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8b82f06a20> import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8b82ffe6f0> import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8b82e1e1b0> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8b82e1e000> # destroy ansible.module_utils.distro 
import 'ansible.module_utils.distro' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # import 'ansible.module_utils.basic' # # zipimport: zlib available # zipimport: zlib available import 'ansible.modules' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.namespace' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.typing' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/__init__.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/context.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/process.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc' import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8b82eb2300> # /usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/reduction.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc matches /usr/lib64/python3.12/pickle.py # code object from 
'/usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc matches /usr/lib64/python3.12/_compat_pickle.py # code object from '/usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc' import '_compat_pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8b82af0140> # extension module '_pickle' loaded from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' # extension module '_pickle' executed from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' import '_pickle' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8b82af04a0> import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8b82e9b1a0> import 'multiprocessing.reduction' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8b82eb2e40> import 'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8b82eb0aa0> import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8b82eb1340> # /usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/pool.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc matches /usr/lib64/python3.12/queue.py # code object from '/usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc matches /usr/lib64/python3.12/heapq.py # code object from '/usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc' # extension module '_heapq' loaded from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' # extension module '_heapq' executed from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8b82af3440> import 'heapq' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f8b82af2cf0> # extension module '_queue' loaded from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' # extension module '_queue' executed from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8b82af2ed0> import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8b82af2120> # /usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/util.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc' import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8b82af3500> # /usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/connection.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc' # extension module '_multiprocessing' loaded from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' # extension module '_multiprocessing' executed from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8b82b55fd0> import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8b82af3fe0> import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8b82eb06b0> import 'ansible.module_utils.facts.timeout' # import 'ansible.module_utils.facts.collector' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other.facter' # # zipimport: zlib available # zipimport: zlib available import 
'ansible.module_utils.facts.other.ohai' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.apparmor' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.caps' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.chroot' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.utils' # import 'ansible.module_utils.facts.system.cmdline' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.distribution' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.datetime' # import 'ansible.module_utils.facts.system.date_time' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.env' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.dns' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.fips' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.loadavg' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/glob.py # code object from '/usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc' import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8b82b578c0> # /usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc matches /usr/lib64/python3.12/configparser.py # code object from '/usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc' import 'configparser' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f8b82b56c90> import 'ansible.module_utils.facts.system.local' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.lsb' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.pkg_mgr' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.platform' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc matches /usr/lib64/python3.12/ssl.py # code object from '/usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc' # extension module '_ssl' loaded from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' # extension module '_ssl' executed from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8b82b8e2a0> import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8b82b7e0f0> import 'ansible.module_utils.facts.system.python' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.selinux' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.version' # import 'ansible.module_utils.facts.system.service_mgr' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.ssh_pub_keys' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc matches /usr/lib64/python3.12/getpass.py # code object from '/usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc' # extension module 'termios' loaded from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' # extension module 'termios' executed from 
'/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8b82ba5b80> import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8b82ba5be0> import 'ansible.module_utils.facts.system.user' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.base' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.aix' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.sysctl' # import 'ansible.module_utils.facts.hardware.darwin' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.freebsd' # import 'ansible.module_utils.facts.hardware.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.hpux' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.linux' # import 'ansible.module_utils.facts.hardware.hurd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.sunos' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.base' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib 
available # zipimport: zlib available import 'ansible.module_utils.facts.network.generic_bsd' # import 'ansible.module_utils.facts.network.aix' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.darwin' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.fc_wwn' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.freebsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.hpux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.hurd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.linux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.iscsi' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.nvme' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.sunos' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.base' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.sysctl' # import 'ansible.module_utils.facts.virtual.freebsd' # import 'ansible.module_utils.facts.virtual.dragonfly' # # 
zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.hpux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.linux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.sunos' # import 'ansible.module_utils.facts.default_collectors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.ansible_collector' # import 'ansible.module_utils.facts.compat' # import 'ansible.module_utils.facts' # # zipimport: zlib available # /usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc matches /usr/lib64/python3.12/encodings/idna.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc matches /usr/lib64/python3.12/stringprep.py # code object from '/usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc' # extension module 'unicodedata' loaded from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' # extension module 'unicodedata' executed from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8b8293a780> import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8b82939310> import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8b82b7f200> # /usr/lib64/python3.12/multiprocessing/__pycache__/queues.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/queues.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/queues.cpython-312.pyc' import 
'multiprocessing.queues' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8b82983200> # /usr/lib64/python3.12/multiprocessing/__pycache__/synchronize.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/synchronize.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/synchronize.cpython-312.pyc' import 'multiprocessing.synchronize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8b82981250> # /usr/lib64/python3.12/multiprocessing/dummy/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/dummy/__init__.py # code object from '/usr/lib64/python3.12/multiprocessing/dummy/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/dummy/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/dummy/connection.py # code object from '/usr/lib64/python3.12/multiprocessing/dummy/__pycache__/connection.cpython-312.pyc' import 'multiprocessing.dummy.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8b82983500> import 'multiprocessing.dummy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8b82982180> PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame {"ansible_facts": {"ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_system": "Linux", "ansible_kernel": "6.11.0-25.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Mon Sep 16 20:35:26 UTC 2024", "ansible_machine": "x86_64", 
"ansible_python_version": "3.12.5", "ansible_fqdn": "managed-node1", "ansible_hostname": "managed-node1", "ansible_nodename": "managed-node1", "ansible_domain": "", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec277914f6c5b9c03bd977e30033112b", "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_iscsi_iqn": "", "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Tuesday", "weekday_number": "2", "weeknumber": "39", "day": "24", "hour": "14", "minute": "57", "second": "04", "epoch": "1727204224", "epoch_int": "1727204224", "date": "2024-09-24", "time": "14:57:04", "iso8601_micro": "2024-09-24T18:57:04.568276Z", "iso8601": "2024-09-24T18:57:04Z", "iso8601_basic": "20240924T145704568276", "iso8601_basic_short": "20240924T145704", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.45.138 50362 10.31.14.47 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.45.138 50362 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": 
"/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_local": {}, "ansible_fibre_channel_wwn": [], "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_fips": false, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2923, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 608, "free": 2923}, "nocache": {"free": 3282, "used": 249}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec277914-f6c5-b9c0-3bd9-77e30033112b", "ansible_product_uuid": "ec277914-f6c5-b9c0-3bd9-77e30033112b", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], 
"labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["973ca870-ed1b-4e56-a8b4-735608119a28"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "973ca870-ed1b-4e56-a8b4-735608119a28", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["973ca870-ed1b-4e56-a8b4-735608119a28"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 815, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261785198592, "block_size": 4096, "block_total": 65519099, "block_available": 63912402, "block_used": 1606697, "inode_total": 131070960, "inode_available": 131027257, "inode_used": 43703, "uuid": "973ca870-ed1b-4e56-a8b4-735608119a28"}], "ansible_apparmor": {"status": "disabled"}, "ansible_ssh_host_key_rsa_public": 
"AAAAB3NzaC1yc2EAAAADAQABAAABgQCWk3MCbpUJBEaXgG200pw3DBo34ukitT+wfYcFTuNXyUUPaXQe7v940X0cf5U78BgS3AAiRxfHirMb+4r43rwxBe5tl4Vq2WM+kz3JnOtxK8ZXTmwS9PbltzX5fg5CVds9Bu6KIwABJMlgT9CTHVjFlTBc8wpoeJvx8wVgMwQlnF+PFN/lzD0gn0sF11bqe2QCvxmm9r7Lum/QdIVGgOiZMMbULJZb0Iih24Tn74Ho6n9zLSFZ5FiFifjm7M6k1hVtfcAQi3GfPdUxkN0Z66f3KaW4hptFlHxttjLliuAyfNF4UrXIab7y/nDix1ll4x4lLHsVRpcwtmVVe/Z+2/pcmRv4fjw8YzWY1pLV5u1BUUBthICgfv0pXEuLP9UD/krnjy0Ho9gAbtjdoRWtn7gvjRds+WEwk83rZS3UnAc3pl2DmHzbp4IfRC1zp8eJPJoVwcSTEr61su59tkNntjdKAcLWeas1p9XBaTIRx7aqRp7Vdet96xbSnDnBCriXgAU=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBJzECuW8BnvEbYnQxnxmebvg9TYk9r0OUd9aUg8FFv4MvjSzW8tCfnW556hw9n4PI2hShtAWz7XExrMZPlTQXRo=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAINUE+mKAHEgVl/vTdVMwRCu3lDCTOYBl1RcikvxylCeg", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_loadavg": {"1m": 0.47021484375, "5m": 0.509765625, "15m": 0.29931640625}, "ansible_interfaces": ["eth0", "lo"], "ansible_eth0": {"device": "eth0", "macaddress": "0a:ff:dd:89:9b:e5", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.14.47", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22"}, "ipv6": [{"address": "fe80::8ff:ddff:fe89:9be5", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", 
"tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", 
"broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", 
"rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.12.1", "interface": "eth0", "address": "10.31.14.47", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22", "macaddress": "0a:ff:dd:89:9b:e5", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.14.47"], "ansible_all_ipv6_addresses": ["fe80::8ff:ddff:fe89:9be5"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.14.47", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::8ff:ddff:fe89:9be5"]}, "ansible_pkg_mgr": "dnf", "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_is_chroot": false, "ansible_lsb": {}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-25.el10.x86_64", "root": "UUID=973ca870-ed1b-4e56-a8b4-735608119a28", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-25.el10.x86_64", "root": "UUID=973ca870-ed1b-4e56-a8b4-735608119a28", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:11e86335-d786-4518-8abc-c9417b351256", "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, 
"invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} # clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing 
_sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing 
__future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # 
cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing 
ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _heapq # cleanup[2] removing heapq # destroy heapq # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] 
removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] 
removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # 
destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy 
ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # cleanup[2] removing multiprocessing.queues # cleanup[2] removing multiprocessing.synchronize # cleanup[2] removing multiprocessing.dummy.connection # cleanup[2] removing multiprocessing.dummy # destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy _blake2 # destroy binascii # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy 
json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy locale # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy logging # destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.queues # destroy multiprocessing.synchronize # destroy multiprocessing.dummy # destroy multiprocessing.pool # destroy signal # destroy pickle # destroy _compat_pickle # destroy _pickle # destroy queue # destroy _heapq # destroy _queue # destroy multiprocessing.reduction # destroy selectors # destroy shlex # destroy fcntl # destroy datetime # destroy subprocess # destroy base64 # destroy _ssl # destroy ansible.module_utils.compat.selinux # destroy getpass # destroy pwd # destroy termios # destroy json # destroy socket # destroy struct # destroy glob # destroy fnmatch # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # destroy unicodedata # destroy errno # destroy multiprocessing.connection # destroy tempfile # destroy multiprocessing.context # destroy multiprocessing.process # destroy multiprocessing.util # destroy _multiprocessing # destroy array # destroy multiprocessing.dummy.connection # cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping configparser # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] 
wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy 
systemd._reader # destroy systemd._journal # destroy _datetime # destroy sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy encodings.idna # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _hashlib # destroy _operator # destroy _sre # destroy _string # destroy re # destroy itertools # destroy _abc # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: 
hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.47 closed. [WARNING]: Module invocation had junk after the JSON data:
ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # 
cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing 
ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy 
ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # cleanup[2] removing multiprocessing.queues # cleanup[2] 
removing multiprocessing.synchronize # cleanup[2] removing multiprocessing.dummy.connection # cleanup[2] removing multiprocessing.dummy # destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy _blake2 # destroy binascii # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy locale # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy logging # destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.queues # destroy multiprocessing.synchronize # destroy multiprocessing.dummy # destroy multiprocessing.pool # destroy signal # destroy pickle # destroy _compat_pickle # destroy _pickle # destroy queue # destroy _heapq # destroy _queue # destroy multiprocessing.reduction # destroy selectors # destroy shlex # destroy fcntl # destroy datetime # destroy subprocess # destroy base64 # destroy _ssl # destroy ansible.module_utils.compat.selinux # destroy getpass # destroy pwd # destroy termios # destroy json # destroy socket # destroy struct # destroy glob # destroy fnmatch # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # destroy unicodedata # destroy errno # destroy multiprocessing.connection # destroy tempfile # destroy multiprocessing.context # 
destroy multiprocessing.process # destroy multiprocessing.util # destroy _multiprocessing # destroy array # destroy multiprocessing.dummy.connection # cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping configparser # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # 
cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy encodings.idna # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _hashlib # destroy _operator # 
destroy _sre # destroy _string # destroy re # destroy itertools # destroy _abc # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks
[WARNING]: Platform linux on host managed-node1 is using the discovered Python interpreter at /usr/bin/python3.12, but future installation of another Python interpreter could change the meaning of that path. See https://docs.ansible.com/ansible-core/2.17/reference_appendices/interpreter_discovery.html for more information.
44109 1727204225.17982: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204223.2742577-44248-202540709669600/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None})
44109 1727204225.17986: _low_level_execute_command(): starting
44109 1727204225.17988: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204223.2742577-44248-202540709669600/ > /dev/null 2>&1 && sleep 0'
44109 1727204225.18582: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
44109 1727204225.18593:
stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44109 1727204225.18596: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found <<< 44109 1727204225.18599: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44109 1727204225.18635: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 44109 1727204225.18820: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44109 1727204225.21621: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44109 1727204225.21625: stdout chunk (state=3): >>><<< 44109 1727204225.21681: stderr chunk (state=3): >>><<< 44109 1727204225.21685: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: 
checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44109 1727204225.21687: handler run complete 44109 1727204225.21768: variable 'ansible_facts' from source: unknown 44109 1727204225.21867: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44109 1727204225.22265: variable 'ansible_facts' from source: unknown 44109 1727204225.22387: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44109 1727204225.22542: attempt loop complete, returning result 44109 1727204225.22629: _execute() done 44109 1727204225.22633: dumping result to json 44109 1727204225.22635: done dumping result, returning 44109 1727204225.22638: done running TaskExecutor() for managed-node1/TASK: Gathering Facts [028d2410-947f-ed67-a560-0000000000af] 44109 1727204225.22641: sending task result for task 028d2410-947f-ed67-a560-0000000000af ok: [managed-node1] 44109 1727204225.23477: no more pending results, returning what we have 44109 1727204225.23480: results queue empty 44109 1727204225.23481: checking for any_errors_fatal 44109 1727204225.23483: done checking for any_errors_fatal 44109 1727204225.23483: checking for max_fail_percentage 44109 1727204225.23485: done checking for max_fail_percentage 44109 1727204225.23486: checking to see if all hosts have failed and the running result is not ok 44109 1727204225.23487: done checking to see if all hosts have failed 44109 1727204225.23488: getting the remaining hosts for this loop 44109 1727204225.23489: done getting the remaining hosts for this loop 44109 
1727204225.23493: getting the next task for host managed-node1 44109 1727204225.23584: done getting next task for host managed-node1 44109 1727204225.23586: ^ task is: TASK: meta (flush_handlers) 44109 1727204225.23588: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44109 1727204225.23592: getting variables 44109 1727204225.23593: in VariableManager get_vars() 44109 1727204225.23621: Calling all_inventory to load vars for managed-node1 44109 1727204225.23624: Calling groups_inventory to load vars for managed-node1 44109 1727204225.23627: Calling all_plugins_inventory to load vars for managed-node1 44109 1727204225.23633: done sending task result for task 028d2410-947f-ed67-a560-0000000000af 44109 1727204225.23635: WORKER PROCESS EXITING 44109 1727204225.23643: Calling all_plugins_play to load vars for managed-node1 44109 1727204225.23646: Calling groups_plugins_inventory to load vars for managed-node1 44109 1727204225.23648: Calling groups_plugins_play to load vars for managed-node1 44109 1727204225.23874: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44109 1727204225.24071: done with get_vars() 44109 1727204225.24083: done getting variables 44109 1727204225.24155: in VariableManager get_vars() 44109 1727204225.24164: Calling all_inventory to load vars for managed-node1 44109 1727204225.24166: Calling groups_inventory to load vars for managed-node1 44109 1727204225.24169: Calling all_plugins_inventory to load vars for managed-node1 44109 1727204225.24173: Calling all_plugins_play to load vars for managed-node1 44109 1727204225.24178: Calling groups_plugins_inventory to load vars for managed-node1 44109 1727204225.24181: Calling groups_plugins_play to load 
vars for managed-node1 44109 1727204225.24323: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44109 1727204225.24534: done with get_vars() 44109 1727204225.24547: done queuing things up, now waiting for results queue to drain 44109 1727204225.24549: results queue empty 44109 1727204225.24550: checking for any_errors_fatal 44109 1727204225.24552: done checking for any_errors_fatal 44109 1727204225.24553: checking for max_fail_percentage 44109 1727204225.24554: done checking for max_fail_percentage 44109 1727204225.24555: checking to see if all hosts have failed and the running result is not ok 44109 1727204225.24560: done checking to see if all hosts have failed 44109 1727204225.24560: getting the remaining hosts for this loop 44109 1727204225.24561: done getting the remaining hosts for this loop 44109 1727204225.24564: getting the next task for host managed-node1 44109 1727204225.24568: done getting next task for host managed-node1 44109 1727204225.24570: ^ task is: TASK: Include the task 'el_repo_setup.yml' 44109 1727204225.24572: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False
44109 1727204225.24574: getting variables
44109 1727204225.24575: in VariableManager get_vars()
44109 1727204225.24592: Calling all_inventory to load vars for managed-node1
44109 1727204225.24594: Calling groups_inventory to load vars for managed-node1
44109 1727204225.24596: Calling all_plugins_inventory to load vars for managed-node1
44109 1727204225.24601: Calling all_plugins_play to load vars for managed-node1
44109 1727204225.24603: Calling groups_plugins_inventory to load vars for managed-node1
44109 1727204225.24606: Calling groups_plugins_play to load vars for managed-node1
44109 1727204225.24748: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
44109 1727204225.24944: done with get_vars()
44109 1727204225.24951: done getting variables

TASK [Include the task 'el_repo_setup.yml'] ************************************
task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/tests_routing_rules_nm.yml:11
Tuesday 24 September 2024 14:57:05 -0400 (0:00:02.029) 0:00:02.046 *****
44109 1727204225.25034: entering _queue_task() for managed-node1/include_tasks
44109 1727204225.25036: Creating lock for include_tasks
44109 1727204225.25333: worker is 1 (out of 1 available)
44109 1727204225.25351: exiting _queue_task() for managed-node1/include_tasks
44109 1727204225.25362: done queuing things up, now waiting for results queue to drain
44109 1727204225.25364: waiting for pending results...
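The `auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566'` lines in the SSH debug output earlier in this log show OpenSSH connection multiplexing: an existing master connection is reused instead of re-authenticating for each task. As a minimal sketch (not taken from this run — the option values and inventory layout below are illustrative assumptions), such multiplexing can be requested per host through the `ansible_ssh_common_args` inventory variable:

```yaml
# Illustrative inventory fragment; values are assumptions, not from this log.
# ansible_ssh_common_args is appended to every ssh/sftp/scp invocation.
# ControlMaster=auto creates or reuses one master connection per host,
# ControlPersist keeps its socket alive between tasks, and ControlPath
# (%C = hash of host/port/user) is where the mux socket lives.
all:
  hosts:
    managed-node1:
      ansible_ssh_common_args: >-
        -o ControlMaster=auto
        -o ControlPersist=60s
        -o ControlPath=~/.ansible/cp/%C
```

With a persistent master in place, follow-up tasks skip the key exchange entirely, which is why the `mux_client_request_session` round-trips above complete in milliseconds.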
44109 1727204225.25602: running TaskExecutor() for managed-node1/TASK: Include the task 'el_repo_setup.yml' 44109 1727204225.25710: in run() - task 028d2410-947f-ed67-a560-000000000006 44109 1727204225.25728: variable 'ansible_search_path' from source: unknown 44109 1727204225.25763: calling self._execute() 44109 1727204225.25843: variable 'ansible_host' from source: host vars for 'managed-node1' 44109 1727204225.25854: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44109 1727204225.25867: variable 'omit' from source: magic vars 44109 1727204225.25981: _execute() done 44109 1727204225.25990: dumping result to json 44109 1727204225.26002: done dumping result, returning 44109 1727204225.26019: done running TaskExecutor() for managed-node1/TASK: Include the task 'el_repo_setup.yml' [028d2410-947f-ed67-a560-000000000006] 44109 1727204225.26028: sending task result for task 028d2410-947f-ed67-a560-000000000006 44109 1727204225.26255: done sending task result for task 028d2410-947f-ed67-a560-000000000006 44109 1727204225.26258: WORKER PROCESS EXITING 44109 1727204225.26304: no more pending results, returning what we have 44109 1727204225.26309: in VariableManager get_vars() 44109 1727204225.26341: Calling all_inventory to load vars for managed-node1 44109 1727204225.26344: Calling groups_inventory to load vars for managed-node1 44109 1727204225.26348: Calling all_plugins_inventory to load vars for managed-node1 44109 1727204225.26364: Calling all_plugins_play to load vars for managed-node1 44109 1727204225.26374: Calling groups_plugins_inventory to load vars for managed-node1 44109 1727204225.26380: Calling groups_plugins_play to load vars for managed-node1 44109 1727204225.26700: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44109 1727204225.26880: done with get_vars() 44109 1727204225.26886: variable 'ansible_search_path' from source: unknown 44109 1727204225.26899: we have 
included files to process 44109 1727204225.26900: generating all_blocks data 44109 1727204225.26901: done generating all_blocks data 44109 1727204225.26902: processing included file: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml 44109 1727204225.26903: loading included file: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml 44109 1727204225.26905: Loading data from /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml 44109 1727204225.27564: in VariableManager get_vars() 44109 1727204225.27593: done with get_vars() 44109 1727204225.27605: done processing included file 44109 1727204225.27607: iterating over new_blocks loaded from include file 44109 1727204225.27609: in VariableManager get_vars() 44109 1727204225.27619: done with get_vars() 44109 1727204225.27620: filtering new block on tags 44109 1727204225.27634: done filtering new block on tags 44109 1727204225.27637: in VariableManager get_vars() 44109 1727204225.27647: done with get_vars() 44109 1727204225.27649: filtering new block on tags 44109 1727204225.27665: done filtering new block on tags 44109 1727204225.27668: in VariableManager get_vars() 44109 1727204225.27687: done with get_vars() 44109 1727204225.27689: filtering new block on tags 44109 1727204225.27702: done filtering new block on tags 44109 1727204225.27704: done iterating over new_blocks loaded from include file included: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml for managed-node1 44109 1727204225.27710: extending task lists for all hosts with included blocks 44109 1727204225.27756: done extending task lists 44109 1727204225.27757: done processing included files 44109 1727204225.27758: results queue empty 44109 1727204225.27759: checking for any_errors_fatal 44109 1727204225.27760: done checking for any_errors_fatal 44109 
1727204225.27761: checking for max_fail_percentage 44109 1727204225.27762: done checking for max_fail_percentage 44109 1727204225.27762: checking to see if all hosts have failed and the running result is not ok 44109 1727204225.27763: done checking to see if all hosts have failed 44109 1727204225.27764: getting the remaining hosts for this loop 44109 1727204225.27765: done getting the remaining hosts for this loop 44109 1727204225.27767: getting the next task for host managed-node1 44109 1727204225.27771: done getting next task for host managed-node1 44109 1727204225.27773: ^ task is: TASK: Gather the minimum subset of ansible_facts required by the network role test 44109 1727204225.27779: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44109 1727204225.27781: getting variables 44109 1727204225.27782: in VariableManager get_vars() 44109 1727204225.27797: Calling all_inventory to load vars for managed-node1 44109 1727204225.27799: Calling groups_inventory to load vars for managed-node1 44109 1727204225.27801: Calling all_plugins_inventory to load vars for managed-node1 44109 1727204225.27806: Calling all_plugins_play to load vars for managed-node1 44109 1727204225.27808: Calling groups_plugins_inventory to load vars for managed-node1 44109 1727204225.27811: Calling groups_plugins_play to load vars for managed-node1 44109 1727204225.27982: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44109 1727204225.28163: done with get_vars() 44109 1727204225.28171: done getting variables TASK [Gather the minimum subset of ansible_facts required by the network role test] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:3 Tuesday 24 September 2024 14:57:05 -0400 (0:00:00.032) 0:00:02.079 ***** 44109 1727204225.28297: entering _queue_task() for managed-node1/setup 44109 1727204225.28897: worker is 1 (out of 1 available) 44109 1727204225.28914: exiting _queue_task() for managed-node1/setup 44109 1727204225.28923: done queuing things up, now waiting for results queue to drain 44109 1727204225.28924: waiting for pending results... 
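The handoff visible just above — `entering _queue_task()`, a worker picking the task up, then the main process `waiting for pending results...` — is Ansible's queue/drain pattern between the strategy and its workers. The sketch below models that pattern only; it is not Ansible's actual `WorkerProcess`/`StrategyBase` code (Ansible forks worker processes, while this uses a thread for simplicity), and the names `worker`, `run_tasks`, `task_q`, and `result_q` are invented for illustration.

```python
import queue
import threading

def worker(task_q: queue.Queue, result_q: queue.Queue) -> None:
    """Worker loop: execute queued tasks and post results (loosely mimics a worker process)."""
    while True:
        task = task_q.get()
        if task is None:          # sentinel: shut the worker down
            break
        host, action = task
        result_q.put((host, f"ran {action}"))

def run_tasks(tasks):
    """Queue tasks, then drain results -- the sequence the log shows as
    'entering _queue_task()' followed by 'waiting for pending results...'."""
    task_q: queue.Queue = queue.Queue()
    result_q: queue.Queue = queue.Queue()
    t = threading.Thread(target=worker, args=(task_q, result_q))
    t.start()
    for task in tasks:
        task_q.put(task)          # "exiting _queue_task() ... done queuing things up"
    task_q.put(None)              # no more tasks for this batch
    t.join()                      # "waiting for results queue to drain"
    results = []
    while not result_q.empty():
        results.append(result_q.get())
    return results

print(run_tasks([("managed-node1", "setup")]))
```

In the real strategy plugin the drain step also feeds the per-host state machine (the `HOST STATE: block=..., task=...` lines in this log); here the drain just collects results.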
44109 1727204225.29444: running TaskExecutor() for managed-node1/TASK: Gather the minimum subset of ansible_facts required by the network role test 44109 1727204225.29508: in run() - task 028d2410-947f-ed67-a560-0000000000c0 44109 1727204225.29528: variable 'ansible_search_path' from source: unknown 44109 1727204225.29565: variable 'ansible_search_path' from source: unknown 44109 1727204225.29584: calling self._execute() 44109 1727204225.29655: variable 'ansible_host' from source: host vars for 'managed-node1' 44109 1727204225.29673: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44109 1727204225.29783: variable 'omit' from source: magic vars 44109 1727204225.30258: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 44109 1727204225.33138: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 44109 1727204225.33215: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 44109 1727204225.33253: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 44109 1727204225.33320: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 44109 1727204225.33406: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 44109 1727204225.33687: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44109 1727204225.33690: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44109 1727204225.33693: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44109 1727204225.33787: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44109 1727204225.33811: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44109 1727204225.34343: variable 'ansible_facts' from source: unknown 44109 1727204225.34346: variable 'network_test_required_facts' from source: task vars 44109 1727204225.34560: Evaluated conditional (not ansible_facts.keys() | list | intersect(network_test_required_facts) == network_test_required_facts): True 44109 1727204225.34563: variable 'omit' from source: magic vars 44109 1727204225.34565: variable 'omit' from source: magic vars 44109 1727204225.34567: variable 'omit' from source: magic vars 44109 1727204225.34694: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 44109 1727204225.34726: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 44109 1727204225.34748: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 44109 1727204225.34773: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44109 1727204225.34984: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44109 1727204225.34987: variable 'inventory_hostname' from source: host vars for 'managed-node1' 44109 1727204225.34990: variable 'ansible_host' from source: host vars for 
'managed-node1' 44109 1727204225.34994: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44109 1727204225.35180: Set connection var ansible_connection to ssh 44109 1727204225.35183: Set connection var ansible_timeout to 10 44109 1727204225.35186: Set connection var ansible_module_compression to ZIP_DEFLATED 44109 1727204225.35188: Set connection var ansible_pipelining to False 44109 1727204225.35190: Set connection var ansible_shell_executable to /bin/sh 44109 1727204225.35219: Set connection var ansible_shell_type to sh 44109 1727204225.35246: variable 'ansible_shell_executable' from source: unknown 44109 1727204225.35315: variable 'ansible_connection' from source: unknown 44109 1727204225.35328: variable 'ansible_module_compression' from source: unknown 44109 1727204225.35337: variable 'ansible_shell_type' from source: unknown 44109 1727204225.35344: variable 'ansible_shell_executable' from source: unknown 44109 1727204225.35352: variable 'ansible_host' from source: host vars for 'managed-node1' 44109 1727204225.35360: variable 'ansible_pipelining' from source: unknown 44109 1727204225.35367: variable 'ansible_timeout' from source: unknown 44109 1727204225.35377: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44109 1727204225.35687: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 44109 1727204225.35704: variable 'omit' from source: magic vars 44109 1727204225.35713: starting attempt loop 44109 1727204225.35720: running the handler 44109 1727204225.35746: _low_level_execute_command(): starting 44109 1727204225.35763: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 44109 1727204225.36446: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44109 
1727204225.36461: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44109 1727204225.36477: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44109 1727204225.36520: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44109 1727204225.36538: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44109 1727204225.36628: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 44109 1727204225.36641: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44109 1727204225.36656: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44109 1727204225.36771: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44109 1727204225.39227: stdout chunk (state=3): >>>/root <<< 44109 1727204225.39326: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44109 1727204225.39446: stderr chunk (state=3): >>><<< 44109 1727204225.39449: stdout chunk (state=3): >>><<< 44109 1727204225.39469: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration 
data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44109 1727204225.39584: _low_level_execute_command(): starting 44109 1727204225.39587: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204225.394872-44343-59979469677423 `" && echo ansible-tmp-1727204225.394872-44343-59979469677423="` echo /root/.ansible/tmp/ansible-tmp-1727204225.394872-44343-59979469677423 `" ) && sleep 0' 44109 1727204225.41085: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44109 1727204225.41089: stderr chunk (state=3): >>>debug1: 
configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration <<< 44109 1727204225.41092: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44109 1727204225.41096: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44109 1727204225.41197: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 44109 1727204225.41200: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 44109 1727204225.41394: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 44109 1727204225.44277: stdout chunk (state=3): >>>ansible-tmp-1727204225.394872-44343-59979469677423=/root/.ansible/tmp/ansible-tmp-1727204225.394872-44343-59979469677423 <<< 44109 1727204225.45194: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44109 1727204225.45198: stdout chunk (state=3): >>><<< 44109 1727204225.45200: stderr chunk (state=3): >>><<< 44109 1727204225.45203: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204225.394872-44343-59979469677423=/root/.ansible/tmp/ansible-tmp-1727204225.394872-44343-59979469677423 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 44109 1727204225.45206: variable 'ansible_module_compression' from source: unknown 44109 1727204225.45208: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-44109pzfqangk/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 44109 1727204225.45210: variable 'ansible_facts' from source: unknown 44109 1727204225.46157: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204225.394872-44343-59979469677423/AnsiballZ_setup.py 44109 1727204225.46620: Sending initial data 44109 1727204225.46624: Sent initial data (152 bytes) 44109 1727204225.47808: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44109 1727204225.47939: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 44109 1727204225.47952: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 44109 1727204225.48234: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44109 1727204225.50198: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 44109 1727204225.50403: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 44109 1727204225.50483: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-44109pzfqangk/tmp6u68ugoq /root/.ansible/tmp/ansible-tmp-1727204225.394872-44343-59979469677423/AnsiballZ_setup.py <<< 44109 1727204225.50531: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204225.394872-44343-59979469677423/AnsiballZ_setup.py" <<< 44109 1727204225.50598: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-44109pzfqangk/tmp6u68ugoq" to remote "/root/.ansible/tmp/ansible-tmp-1727204225.394872-44343-59979469677423/AnsiballZ_setup.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204225.394872-44343-59979469677423/AnsiballZ_setup.py" <<< 44109 1727204225.53612: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44109 1727204225.53615: stdout chunk (state=3): >>><<< 44109 1727204225.53618: stderr chunk (state=3): >>><<< 44109 1727204225.53620: done transferring module to remote 44109 1727204225.53657: _low_level_execute_command(): starting 44109 1727204225.53661: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204225.394872-44343-59979469677423/ /root/.ansible/tmp/ansible-tmp-1727204225.394872-44343-59979469677423/AnsiballZ_setup.py && sleep 0' 44109 1727204225.54895: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration 
debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44109 1727204225.55282: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 44109 1727204225.55293: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44109 1727204225.57482: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44109 1727204225.57486: stdout chunk (state=3): >>><<< 44109 1727204225.57488: stderr chunk (state=3): >>><<< 44109 1727204225.57490: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK 
debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44109 1727204225.57492: _low_level_execute_command(): starting 44109 1727204225.57494: _low_level_execute_command(): executing: /bin/sh -c 'PYTHONVERBOSE=1 /usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204225.394872-44343-59979469677423/AnsiballZ_setup.py && sleep 0' 44109 1727204225.58720: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44109 1727204225.58730: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44109 1727204225.58742: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44109 1727204225.58756: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44109 1727204225.58770: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 <<< 44109 1727204225.58777: stderr chunk (state=3): >>>debug2: match not found <<< 44109 1727204225.58788: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44109 1727204225.58812: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 44109 1727204225.58830: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.14.47 is address <<< 44109 1727204225.58833: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 44109 1727204225.58843: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44109 1727204225.58853: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44109 1727204225.58988: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration 
data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 44109 1727204225.59159: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 44109 1727204225.59244: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44109 1727204225.61677: stdout chunk (state=3): >>>import _frozen_importlib # frozen <<< 44109 1727204225.61707: stdout chunk (state=3): >>>import _imp # builtin <<< 44109 1727204225.61934: stdout chunk (state=3): >>>import '_thread' # import '_warnings' # import '_weakref' # import '_io' # import 'marshal' # import 'posix' # import '_frozen_importlib_external' # # installing zipimport hook import 'time' # <<< 44109 1727204225.61939: stdout chunk (state=3): >>>import 'zipimport' # # installed zipimport hook <<< 44109 1727204225.61982: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' <<< 44109 1727204225.62021: stdout chunk (state=3): >>>import '_codecs' # <<< 44109 1727204225.62034: stdout chunk (state=3): >>>import 'codecs' # <<< 44109 1727204225.62062: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py <<< 44109 1727204225.62092: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' <<< 44109 1727204225.62129: stdout chunk (state=3): >>>import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7872a184d0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f78729e7b30> <<< 44109 1727204225.62144: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc 
matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7872a1aa50> <<< 44109 1727204225.62217: stdout chunk (state=3): >>>import '_signal' # import '_abc' # import 'abc' # <<< 44109 1727204225.62474: stdout chunk (state=3): >>>import 'io' # import '_stat' # import 'stat' # import '_collections_abc' # import 'genericpath' # import 'posixpath' # import 'os' # <<< 44109 1727204225.62481: stdout chunk (state=3): >>>import '_sitebuiltins' # Processing user site-packages Processing global site-packages <<< 44109 1727204225.62483: stdout chunk (state=3): >>>Adding directory: '/usr/local/lib/python3.12/site-packages' Adding directory: '/usr/lib64/python3.12/site-packages' <<< 44109 1727204225.62527: stdout chunk (state=3): >>>Adding directory: '/usr/lib/python3.12/site-packages' Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' <<< 44109 1727204225.62533: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py <<< 44109 1727204225.62559: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' <<< 44109 1727204225.62566: stdout chunk (state=3): >>>import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7f787282d130> <<< 44109 1727204225.62619: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py <<< 44109 1727204225.62639: stdout chunk (state=3): >>># code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7f787282e060> <<< 44109 
1727204225.62661: stdout chunk (state=3): >>>import 'site' # <<< 44109 1727204225.62704: stdout chunk (state=3): >>>Python 3.12.5 (main, Aug 23 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. <<< 44109 1727204225.63116: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py <<< 44109 1727204225.63119: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' <<< 44109 1727204225.63145: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py # code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' <<< 44109 1727204225.63165: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py <<< 44109 1727204225.63230: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' <<< 44109 1727204225.63233: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py <<< 44109 1727204225.63264: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' <<< 44109 1727204225.63310: stdout chunk (state=3): >>>import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f787286bf50> <<< 44109 1727204225.63436: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py # code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' <<< 44109 1727204225.63440: stdout chunk (state=3): >>>import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7f78728800e0> # /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches 
/usr/lib64/python3.12/functools.py # code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py <<< 44109 1727204225.63471: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' <<< 44109 1727204225.63496: stdout chunk (state=3): >>>import 'itertools' # <<< 44109 1727204225.63544: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f78728a3920> <<< 44109 1727204225.63559: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py <<< 44109 1727204225.63782: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f78728a3fb0> import '_collections' # <<< 44109 1727204225.63817: stdout chunk (state=3): >>>import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7872883bc0> import '_functools' # import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7872881340> import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7872869100> # /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py <<< 44109 1727204225.63840: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' import '_sre' # <<< 44109 1727204225.63861: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py <<< 44109 
1727204225.63913: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py <<< 44109 1727204225.63938: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' <<< 44109 1727204225.63972: stdout chunk (state=3): >>>import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7f78728c78f0> <<< 44109 1727204225.64012: stdout chunk (state=3): >>>import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f78728c6510> # /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7f78728821e0> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7f78728c4d10> <<< 44109 1727204225.64066: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py <<< 44109 1727204225.64094: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7f78728f4950> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7872868380> <<< 44109 1727204225.64121: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' <<< 44109 1727204225.64164: stdout chunk (state=3): >>># extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' # extension module '_struct' executed from 
'/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f78728f4e00> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f78728f4cb0> <<< 44109 1727204225.64335: stdout chunk (state=3): >>># extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f78728f50a0> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7872866ea0> <<< 44109 1727204225.64339: stdout chunk (state=3): >>># /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py # code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' <<< 44109 1727204225.64356: stdout chunk (state=3): >>>import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f78728f5790> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f78728f5460> import 'importlib.machinery' # <<< 44109 1727204225.64385: stdout chunk (state=3): >>># /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' <<< 44109 1727204225.64463: stdout chunk (state=3): >>>import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f78728f6660> <<< 44109 1727204225.64485: stdout chunk (state=3): >>>import 'importlib.util' # import 'runpy' # # 
/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py <<< 44109 1727204225.64488: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' <<< 44109 1727204225.64586: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7872910890> <<< 44109 1727204225.64589: stdout chunk (state=3): >>>import 'errno' # # extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7872911fd0> <<< 44109 1727204225.64668: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py <<< 44109 1727204225.64686: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7872912e70> <<< 44109 1727204225.64714: stdout chunk (state=3): >>># extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f78729134a0> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7f78729123c0> <<< 44109 1727204225.64791: stdout 
chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py # code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' <<< 44109 1727204225.64808: stdout chunk (state=3): >>># extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7872913e60> <<< 44109 1727204225.64819: stdout chunk (state=3): >>>import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7872913590> <<< 44109 1727204225.64901: stdout chunk (state=3): >>>import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f78728f66c0> # /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py # code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' <<< 44109 1727204225.64927: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py <<< 44109 1727204225.64947: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' <<< 44109 1727204225.65123: stdout chunk (state=3): >>># extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7872607d40> <<< 44109 1727204225.65126: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' # extension module '_bisect' loaded from 
'/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7872634860> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f78726345c0> # extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7872634890> # /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py # code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' <<< 44109 1727204225.65185: stdout chunk (state=3): >>># extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' <<< 44109 1727204225.65316: stdout chunk (state=3): >>># extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f78726351c0> <<< 44109 1727204225.65464: stdout chunk (state=3): >>># extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' <<< 44109 1727204225.65573: stdout chunk (state=3): >>># extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7872635b80> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7872634a70> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7872605ee0> # 
/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py # code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py <<< 44109 1727204225.65588: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7872636f60> <<< 44109 1727204225.65622: stdout chunk (state=3): >>>import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7872635cd0> <<< 44109 1727204225.65634: stdout chunk (state=3): >>>import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f78728f6db0> <<< 44109 1727204225.65666: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py <<< 44109 1727204225.65731: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' <<< 44109 1727204225.65759: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py <<< 44109 1727204225.65788: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' <<< 44109 1727204225.65824: stdout chunk (state=3): >>>import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7f787265b2c0> <<< 44109 1727204225.65906: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py <<< 44109 1727204225.65912: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' <<< 44109 1727204225.65987: stdout chunk (state=3): >>># 
/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py <<< 44109 1727204225.65992: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' <<< 44109 1727204225.66015: stdout chunk (state=3): >>>import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f787267f620> # /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py <<< 44109 1727204225.66074: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' <<< 44109 1727204225.66164: stdout chunk (state=3): >>>import 'ntpath' # <<< 44109 1727204225.66168: stdout chunk (state=3): >>># /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f78726e03b0> # /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py <<< 44109 1727204225.66254: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py <<< 44109 1727204225.66313: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' <<< 44109 1727204225.66526: stdout chunk (state=3): >>>import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7f78726e2ae0> import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f78726e04a0> import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f78726a9430> # /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code 
object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f78724ed460> import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7f787267e420> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7872637ec0> <<< 44109 1727204225.66718: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/cp437.pyc' <<< 44109 1727204225.66742: stdout chunk (state=3): >>>import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7f787267e540> <<< 44109 1727204225.67190: stdout chunk (state=3): >>># zipimport: found 103 names in '/tmp/ansible_setup_payload_zle6jp38/ansible_setup_payload.zip' # zipimport: zlib available <<< 44109 1727204225.67243: stdout chunk (state=3): >>># zipimport: zlib available <<< 44109 1727204225.67392: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py <<< 44109 1727204225.67395: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' <<< 44109 1727204225.67424: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py # code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' <<< 44109 1727204225.67465: stdout chunk (state=3): >>># /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7872557110> <<< 44109 1727204225.67469: stdout chunk (state=3): >>>import '_typing' # <<< 44109 1727204225.67665: stdout chunk (state=3): >>>import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 
0x7f7872536000> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7872535160> # zipimport: zlib available <<< 44109 1727204225.67701: stdout chunk (state=3): >>>import 'ansible' # # zipimport: zlib available <<< 44109 1727204225.67743: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 44109 1727204225.67765: stdout chunk (state=3): >>>import 'ansible.module_utils' # # zipimport: zlib available <<< 44109 1727204225.69299: stdout chunk (state=3): >>># zipimport: zlib available <<< 44109 1727204225.70717: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7872554fe0> <<< 44109 1727204225.70721: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' # extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7872586a50> <<< 44109 1727204225.70745: stdout chunk (state=3): >>>import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f78725867e0> <<< 
44109 1727204225.70784: stdout chunk (state=3): >>>import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f78725860f0> <<< 44109 1727204225.70845: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7872586870> <<< 44109 1727204225.70871: stdout chunk (state=3): >>>import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7872557da0> import 'atexit' # <<< 44109 1727204225.70931: stdout chunk (state=3): >>># extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f78725877d0> <<< 44109 1727204225.71071: stdout chunk (state=3): >>># extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7872587a10> # /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py <<< 44109 1727204225.71083: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' import '_locale' # import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7872587f50> import 'pwd' # <<< 44109 1727204225.71108: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py <<< 44109 1727204225.71125: stdout chunk (state=3): >>># code object from 
'/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' <<< 44109 1727204225.71174: stdout chunk (state=3): >>>import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7871f31d90> <<< 44109 1727204225.71199: stdout chunk (state=3): >>># extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7871f339b0> <<< 44109 1727204225.71291: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py # code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7871f34380> # /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py <<< 44109 1727204225.71335: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' <<< 44109 1727204225.71354: stdout chunk (state=3): >>>import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7871f35280> <<< 44109 1727204225.71380: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py <<< 44109 1727204225.71410: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' <<< 44109 1727204225.71491: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' <<< 44109 1727204225.71518: stdout chunk (state=3): >>>import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7871f37f50> <<< 44109 1727204225.71642: 
stdout chunk (state=3): >>># extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7872912de0> <<< 44109 1727204225.71645: stdout chunk (state=3): >>>import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7871f36210> <<< 44109 1727204225.71648: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py # code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py <<< 44109 1727204225.71660: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py <<< 44109 1727204225.71796: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' <<< 44109 1727204225.72028: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7871f3fe30> import '_tokenize' # <<< 44109 1727204225.72031: stdout chunk (state=3): >>>import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7871f3e900> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7871f3e660> # /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py # code object from 
'/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' <<< 44109 1727204225.72033: stdout chunk (state=3): >>>import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7871f3ebd0> <<< 44109 1727204225.72056: stdout chunk (state=3): >>>import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7871f36720> <<< 44109 1727204225.72085: stdout chunk (state=3): >>># extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' <<< 44109 1727204225.72122: stdout chunk (state=3): >>># extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7871f83ef0> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7871f84200> <<< 44109 1727204225.72143: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py <<< 44109 1727204225.72228: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' <<< 44109 1727204225.72253: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' # extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7f7871f85ca0> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7871f85a60> # /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py <<< 44109 1727204225.72293: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' <<< 44109 1727204225.72339: stdout chunk (state=3): >>># extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' <<< 44109 1727204225.72364: stdout chunk (state=3): >>># extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7871f881d0> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7871f86360> <<< 44109 1727204225.72382: stdout chunk (state=3): >>># /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py <<< 44109 1727204225.72435: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' <<< 44109 1727204225.72543: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py <<< 44109 1727204225.72557: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' import '_string' # <<< 44109 1727204225.72578: stdout chunk (state=3): >>>import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7871f8b980> <<< 44109 1727204225.72647: stdout chunk (state=3): >>>import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7871f88380> <<< 44109 1727204225.72709: stdout chunk (state=3): >>># extension module 'systemd._journal' loaded from 
'/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' <<< 44109 1727204225.72743: stdout chunk (state=3): >>># extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7871f8c7a0> <<< 44109 1727204225.72758: stdout chunk (state=3): >>># extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7871f8cbc0> <<< 44109 1727204225.72818: stdout chunk (state=3): >>># extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7871f8cb00> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7871f843b0> <<< 44109 1727204225.72844: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' <<< 44109 1727204225.72856: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py <<< 44109 1727204225.72891: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' <<< 44109 1727204225.72935: stdout chunk (state=3): >>># extension module '_socket' 
loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' <<< 44109 1727204225.72946: stdout chunk (state=3): >>># extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7871e18170> <<< 44109 1727204225.73373: stdout chunk (state=3): >>># extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7871e194f0> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7871f8e900> # extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7871f8fcb0> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7871f8e510> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat' # # zipimport: zlib available # zipimport: zlib available <<< 44109 1727204225.73494: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 44109 1727204225.73535: stdout chunk (state=3): >>>import 'ansible.module_utils.common' # # zipimport: zlib available <<< 44109 1727204225.73538: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.common.text' # <<< 44109 1727204225.73608: stdout chunk (state=3): >>># zipimport: zlib available <<< 44109 1727204225.73741: stdout chunk (state=3): >>># zipimport: zlib available <<< 44109 1727204225.74053: stdout 
chunk (state=3): >>># zipimport: zlib available <<< 44109 1727204225.74913: stdout chunk (state=3): >>># zipimport: zlib available <<< 44109 1727204225.75872: stdout chunk (state=3): >>>import 'ansible.module_utils.six' # <<< 44109 1727204225.75879: stdout chunk (state=3): >>>import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # <<< 44109 1727204225.75914: stdout chunk (state=3): >>># /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' <<< 44109 1727204225.75961: stdout chunk (state=3): >>># extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7871e1d850> <<< 44109 1727204225.76042: stdout chunk (state=3): >>># /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' <<< 44109 1727204225.76068: stdout chunk (state=3): >>>import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7871e1ebd0> <<< 44109 1727204225.76129: stdout chunk (state=3): >>>import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7871e19700> <<< 44109 1727204225.76209: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.selinux' # # zipimport: zlib available <<< 44109 1727204225.76417: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils._text' # # zipimport: zlib available <<< 44109 1727204225.76581: stdout chunk (state=3): >>># zipimport: zlib available <<< 44109 
1727204225.76779: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py # code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7871e1ecf0> # zipimport: zlib available <<< 44109 1727204225.77175: stdout chunk (state=3): >>># zipimport: zlib available <<< 44109 1727204225.77845: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 44109 1727204225.77860: stdout chunk (state=3): >>>import 'ansible.module_utils.common.collections' # # zipimport: zlib available <<< 44109 1727204225.77899: stdout chunk (state=3): >>># zipimport: zlib available <<< 44109 1727204225.77990: stdout chunk (state=3): >>>import 'ansible.module_utils.common.warnings' # <<< 44109 1727204225.77993: stdout chunk (state=3): >>># zipimport: zlib available <<< 44109 1727204225.78019: stdout chunk (state=3): >>># zipimport: zlib available <<< 44109 1727204225.78097: stdout chunk (state=3): >>>import 'ansible.module_utils.errors' # <<< 44109 1727204225.78117: stdout chunk (state=3): >>># zipimport: zlib available <<< 44109 1727204225.78213: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.parsing' # # zipimport: zlib available # zipimport: zlib available <<< 44109 1727204225.78223: stdout chunk (state=3): >>>import 'ansible.module_utils.parsing.convert_bool' # <<< 44109 1727204225.78320: stdout chunk (state=3): >>># zipimport: zlib available <<< 44109 1727204225.78484: stdout chunk (state=3): >>># zipimport: zlib available <<< 44109 1727204225.79009: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py # code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' import '_ast' # import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7871e1f920> <<< 44109 1727204225.79061: 
stdout chunk (state=3): >>># zipimport: zlib available <<< 44109 1727204225.79248: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.common.text.formatters' # <<< 44109 1727204225.79259: stdout chunk (state=3): >>>import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # <<< 44109 1727204225.79266: stdout chunk (state=3): >>>import 'ansible.module_utils.common.arg_spec' # <<< 44109 1727204225.79295: stdout chunk (state=3): >>># zipimport: zlib available <<< 44109 1727204225.79357: stdout chunk (state=3): >>># zipimport: zlib available <<< 44109 1727204225.79409: stdout chunk (state=3): >>>import 'ansible.module_utils.common.locale' # <<< 44109 1727204225.79424: stdout chunk (state=3): >>># zipimport: zlib available <<< 44109 1727204225.79482: stdout chunk (state=3): >>># zipimport: zlib available <<< 44109 1727204225.79541: stdout chunk (state=3): >>># zipimport: zlib available <<< 44109 1727204225.79630: stdout chunk (state=3): >>># zipimport: zlib available <<< 44109 1727204225.79753: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py <<< 44109 1727204225.79799: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' <<< 44109 1727204225.79955: stdout chunk (state=3): >>># extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7871e2a150> <<< 44109 1727204225.80037: stdout chunk (state=3): >>>import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7871e25b20> import 
'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # <<< 44109 1727204225.80043: stdout chunk (state=3): >>># zipimport: zlib available <<< 44109 1727204225.80189: stdout chunk (state=3): >>># zipimport: zlib available <<< 44109 1727204225.80273: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 44109 1727204225.80354: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' <<< 44109 1727204225.80417: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' <<< 44109 1727204225.80467: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py <<< 44109 1727204225.80527: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' <<< 44109 1727204225.80561: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py <<< 44109 1727204225.80615: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' <<< 44109 1727204225.80670: stdout chunk (state=3): >>>import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7871f02c00> <<< 44109 1727204225.80730: stdout chunk (state=3): >>>import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7871ffe900> <<< 44109 1727204225.80867: stdout chunk (state=3): >>>import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7871e2a420> <<< 44109 1727204225.80870: 
stdout chunk (state=3): >>>import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7871e1c860> <<< 44109 1727204225.80897: stdout chunk (state=3): >>># destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # # zipimport: zlib available <<< 44109 1727204225.80949: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.common._utils' # <<< 44109 1727204225.80961: stdout chunk (state=3): >>>import 'ansible.module_utils.common.sys_info' # <<< 44109 1727204225.81059: stdout chunk (state=3): >>>import 'ansible.module_utils.basic' # # zipimport: zlib available # zipimport: zlib available import 'ansible.modules' # <<< 44109 1727204225.81120: stdout chunk (state=3): >>># zipimport: zlib available <<< 44109 1727204225.81166: stdout chunk (state=3): >>># zipimport: zlib available <<< 44109 1727204225.81283: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 44109 1727204225.81305: stdout chunk (state=3): >>># zipimport: zlib available <<< 44109 1727204225.81419: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 44109 1727204225.81465: stdout chunk (state=3): >>># zipimport: zlib available <<< 44109 1727204225.81522: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.namespace' # <<< 44109 1727204225.81531: stdout chunk (state=3): >>># zipimport: zlib available <<< 44109 1727204225.81658: stdout chunk (state=3): >>># zipimport: zlib available <<< 44109 1727204225.81764: stdout chunk (state=3): >>># zipimport: zlib available <<< 44109 1727204225.81835: stdout chunk (state=3): >>># zipimport: zlib available <<< 44109 1727204225.81848: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.typing' # <<< 44109 1727204225.81855: stdout chunk (state=3): >>># zipimport: zlib available <<< 44109 1727204225.82131: stdout chunk (state=3): >>># zipimport: zlib available <<< 44109 1727204225.82310: stdout chunk (state=3): >>># 
zipimport: zlib available <<< 44109 1727204225.82335: stdout chunk (state=3): >>># zipimport: zlib available <<< 44109 1727204225.82411: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/__init__.py <<< 44109 1727204225.82442: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/context.py <<< 44109 1727204225.82461: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/process.py <<< 44109 1727204225.82488: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc' <<< 44109 1727204225.82522: stdout chunk (state=3): >>>import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7871eba570> <<< 44109 1727204225.82538: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/reduction.py <<< 44109 1727204225.82591: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc' <<< 44109 1727204225.82601: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc matches /usr/lib64/python3.12/pickle.py <<< 44109 1727204225.82622: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc' <<< 44109 1727204225.82660: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc matches /usr/lib64/python3.12/_compat_pickle.py <<< 44109 
1727204225.82664: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc' import '_compat_pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7871a783b0> <<< 44109 1727204225.82712: stdout chunk (state=3): >>># extension module '_pickle' loaded from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' <<< 44109 1727204225.82721: stdout chunk (state=3): >>># extension module '_pickle' executed from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' import '_pickle' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7871a787a0> <<< 44109 1727204225.82787: stdout chunk (state=3): >>>import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7871ea49e0> <<< 44109 1727204225.82809: stdout chunk (state=3): >>>import 'multiprocessing.reduction' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7871ebb110> <<< 44109 1727204225.83226: stdout chunk (state=3): >>>import 'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7871eb8c50> import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7871eb88c0> # /usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/pool.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc matches /usr/lib64/python3.12/queue.py # code object from '/usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc matches /usr/lib64/python3.12/heapq.py # code object from '/usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc' # extension module '_heapq' loaded from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' # extension module '_heapq' executed from 
'/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7871a7b740> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7871a7aff0> # extension module '_queue' loaded from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' # extension module '_queue' executed from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7871a7b1d0> import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7871a7a420> # /usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/util.py <<< 44109 1727204225.83364: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc' import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7871a7b8f0> # /usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/connection.py <<< 44109 1727204225.83429: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc' # extension module '_multiprocessing' loaded from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' # extension module '_multiprocessing' executed from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7871ada3f0> <<< 44109 1727204225.83504: stdout chunk (state=3): >>>import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7871ad8440> import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7871eb8980> <<< 44109 
1727204225.83589: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.timeout' # import 'ansible.module_utils.facts.collector' # # zipimport: zlib available <<< 44109 1727204225.83592: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.other' # <<< 44109 1727204225.83664: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 44109 1727204225.83743: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.other.facter' # <<< 44109 1727204225.83762: stdout chunk (state=3): >>># zipimport: zlib available <<< 44109 1727204225.83835: stdout chunk (state=3): >>># zipimport: zlib available <<< 44109 1727204225.83913: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.other.ohai' # <<< 44109 1727204225.83946: stdout chunk (state=3): >>># zipimport: zlib available <<< 44109 1727204225.83957: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.system' # # zipimport: zlib available <<< 44109 1727204225.83990: stdout chunk (state=3): >>># zipimport: zlib available <<< 44109 1727204225.84050: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.apparmor' # # zipimport: zlib available <<< 44109 1727204225.84168: stdout chunk (state=3): >>># zipimport: zlib available <<< 44109 1727204225.84188: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.caps' # <<< 44109 1727204225.84203: stdout chunk (state=3): >>># zipimport: zlib available <<< 44109 1727204225.84291: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.system.chroot' # <<< 44109 1727204225.84382: stdout chunk (state=3): >>># zipimport: zlib available <<< 44109 1727204225.84404: stdout chunk (state=3): >>># zipimport: zlib available <<< 44109 1727204225.84578: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available<<< 44109 1727204225.84585: stdout chunk (state=3): >>> <<< 44109 
1727204225.84669: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.utils' # <<< 44109 1727204225.84677: stdout chunk (state=3): >>> <<< 44109 1727204225.84697: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.cmdline' # <<< 44109 1727204225.84700: stdout chunk (state=3): >>> <<< 44109 1727204225.84736: stdout chunk (state=3): >>># zipimport: zlib available <<< 44109 1727204225.85629: stdout chunk (state=3): >>># zipimport: zlib available<<< 44109 1727204225.85700: stdout chunk (state=3): >>> <<< 44109 1727204225.86442: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.distribution' # <<< 44109 1727204225.86490: stdout chunk (state=3): >>># zipimport: zlib available<<< 44109 1727204225.86529: stdout chunk (state=3): >>> <<< 44109 1727204225.86584: stdout chunk (state=3): >>># zipimport: zlib available<<< 44109 1727204225.86618: stdout chunk (state=3): >>> <<< 44109 1727204225.86702: stdout chunk (state=3): >>># zipimport: zlib available <<< 44109 1727204225.86749: stdout chunk (state=3): >>># zipimport: zlib available <<< 44109 1727204225.86829: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.datetime' # import 'ansible.module_utils.facts.system.date_time' # # zipimport: zlib available<<< 44109 1727204225.86878: stdout chunk (state=3): >>> # zipimport: zlib available <<< 44109 1727204225.86940: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.env' # <<< 44109 1727204225.86944: stdout chunk (state=3): >>> <<< 44109 1727204225.87065: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 44109 1727204225.87137: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.dns' # <<< 44109 1727204225.87192: stdout chunk (state=3): >>># zipimport: zlib available <<< 44109 1727204225.87226: stdout chunk (state=3): >>># zipimport: zlib available<<< 44109 1727204225.87262: stdout chunk (state=3): >>> <<< 44109 1727204225.87299: stdout chunk 
(state=3): >>>import 'ansible.module_utils.facts.system.fips' # <<< 44109 1727204225.87354: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 44109 1727204225.87413: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.loadavg' # <<< 44109 1727204225.87430: stdout chunk (state=3): >>># zipimport: zlib available<<< 44109 1727204225.87628: stdout chunk (state=3): >>> # zipimport: zlib available <<< 44109 1727204225.87703: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/glob.py <<< 44109 1727204225.87728: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc' <<< 44109 1727204225.87774: stdout chunk (state=3): >>>import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7871ada810> <<< 44109 1727204225.87816: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc matches /usr/lib64/python3.12/configparser.py <<< 44109 1727204225.87873: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc' <<< 44109 1727204225.88096: stdout chunk (state=3): >>>import 'configparser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7871adb3e0> <<< 44109 1727204225.88120: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.local' # # zipimport: zlib available<<< 44109 1727204225.88204: stdout chunk (state=3): >>> <<< 44109 1727204225.88255: stdout chunk (state=3): >>># zipimport: zlib available <<< 44109 1727204225.88351: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.lsb' # <<< 44109 1727204225.88392: stdout chunk (state=3): >>># zipimport: zlib available<<< 44109 1727204225.88421: stdout chunk (state=3): >>> <<< 44109 1727204225.88565: stdout chunk (state=3): >>># zipimport: zlib available<<< 44109 1727204225.88682: stdout chunk (state=3): >>> import 
'ansible.module_utils.facts.system.pkg_mgr' # <<< 44109 1727204225.88712: stdout chunk (state=3): >>># zipimport: zlib available <<< 44109 1727204225.88824: stdout chunk (state=3): >>># zipimport: zlib available <<< 44109 1727204225.88984: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.platform' # # zipimport: zlib available<<< 44109 1727204225.89297: stdout chunk (state=3): >>> # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc matches /usr/lib64/python3.12/ssl.py # code object from '/usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc' <<< 44109 1727204225.89327: stdout chunk (state=3): >>># extension module '_ssl' loaded from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so'<<< 44109 1727204225.89441: stdout chunk (state=3): >>> # extension module '_ssl' executed from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' <<< 44109 1727204225.89471: stdout chunk (state=3): >>>import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7871b166f0> <<< 44109 1727204225.89778: stdout chunk (state=3): >>>import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7871adbf20><<< 44109 1727204225.89804: stdout chunk (state=3): >>> <<< 44109 1727204225.89861: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.python' # # zipimport: zlib available <<< 44109 1727204225.89940: stdout chunk (state=3): >>># zipimport: zlib available<<< 44109 1727204225.89960: stdout chunk (state=3): >>> <<< 44109 1727204225.90063: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.selinux' # # zipimport: zlib available<<< 44109 1727204225.90068: stdout chunk (state=3): >>> <<< 44109 1727204225.90321: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available<<< 44109 1727204225.90328: stdout chunk (state=3): >>> <<< 44109 1727204225.90506: stdout chunk (state=3): >>># zipimport: zlib available<<< 44109 
1727204225.90512: stdout chunk (state=3): >>> <<< 44109 1727204225.90741: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.version' # <<< 44109 1727204225.90747: stdout chunk (state=3): >>> <<< 44109 1727204225.90761: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.service_mgr' # <<< 44109 1727204225.90766: stdout chunk (state=3): >>> <<< 44109 1727204225.90790: stdout chunk (state=3): >>># zipimport: zlib available<<< 44109 1727204225.90794: stdout chunk (state=3): >>> <<< 44109 1727204225.90865: stdout chunk (state=3): >>># zipimport: zlib available<<< 44109 1727204225.90868: stdout chunk (state=3): >>> <<< 44109 1727204225.90930: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.ssh_pub_keys' # <<< 44109 1727204225.90937: stdout chunk (state=3): >>> <<< 44109 1727204225.90954: stdout chunk (state=3): >>># zipimport: zlib available<<< 44109 1727204225.91023: stdout chunk (state=3): >>> # zipimport: zlib available <<< 44109 1727204225.91102: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc matches /usr/lib64/python3.12/getpass.py <<< 44109 1727204225.91129: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc' <<< 44109 1727204225.91178: stdout chunk (state=3): >>># extension module 'termios' loaded from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so'<<< 44109 1727204225.91211: stdout chunk (state=3): >>> # extension module 'termios' executed from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so'<<< 44109 1727204225.91218: stdout chunk (state=3): >>> <<< 44109 1727204225.91241: stdout chunk (state=3): >>>import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7871b2a270> import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7871b29e80> import 'ansible.module_utils.facts.system.user' # <<< 44109 1727204225.91281: stdout 
chunk (state=3): >>> # zipimport: zlib available <<< 44109 1727204225.91322: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.hardware' # # zipimport: zlib available <<< 44109 1727204225.91464: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.hardware.base' # <<< 44109 1727204225.91489: stdout chunk (state=3): >>># zipimport: zlib available <<< 44109 1727204225.91740: stdout chunk (state=3): >>># zipimport: zlib available<<< 44109 1727204225.91743: stdout chunk (state=3): >>> <<< 44109 1727204225.92002: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.aix' # # zipimport: zlib available <<< 44109 1727204225.92298: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 44109 1727204225.92311: stdout chunk (state=3): >>># zipimport: zlib available <<< 44109 1727204225.92374: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.sysctl' # <<< 44109 1727204225.92383: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.darwin' # <<< 44109 1727204225.92391: stdout chunk (state=3): >>># zipimport: zlib available <<< 44109 1727204225.92422: stdout chunk (state=3): >>># zipimport: zlib available <<< 44109 1727204225.92446: stdout chunk (state=3): >>># zipimport: zlib available <<< 44109 1727204225.92674: stdout chunk (state=3): >>># zipimport: zlib available <<< 44109 1727204225.92903: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.freebsd' # <<< 44109 1727204225.92908: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.dragonfly' # <<< 44109 1727204225.92918: stdout chunk (state=3): >>># zipimport: zlib available <<< 44109 1727204225.93198: stdout chunk (state=3): >>># zipimport: zlib available <<< 44109 1727204225.93294: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.hpux' # <<< 44109 1727204225.93317: stdout chunk (state=3): >>># zipimport: zlib 
available <<< 44109 1727204225.93358: stdout chunk (state=3): >>># zipimport: zlib available <<< 44109 1727204225.93414: stdout chunk (state=3): >>># zipimport: zlib available <<< 44109 1727204225.94384: stdout chunk (state=3): >>># zipimport: zlib available <<< 44109 1727204225.95239: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.linux' # import 'ansible.module_utils.facts.hardware.hurd' # <<< 44109 1727204225.95262: stdout chunk (state=3): >>># zipimport: zlib available <<< 44109 1727204225.95432: stdout chunk (state=3): >>># zipimport: zlib available <<< 44109 1727204225.95590: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.netbsd' # <<< 44109 1727204225.95794: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 44109 1727204225.95899: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.openbsd' # <<< 44109 1727204225.95918: stdout chunk (state=3): >>># zipimport: zlib available <<< 44109 1727204225.96158: stdout chunk (state=3): >>># zipimport: zlib available <<< 44109 1727204225.96389: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.sunos' # <<< 44109 1727204225.96420: stdout chunk (state=3): >>># zipimport: zlib available <<< 44109 1727204225.96423: stdout chunk (state=3): >>># zipimport: zlib available <<< 44109 1727204225.96449: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network' # <<< 44109 1727204225.96470: stdout chunk (state=3): >>># zipimport: zlib available <<< 44109 1727204225.96522: stdout chunk (state=3): >>># zipimport: zlib available <<< 44109 1727204225.96572: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.base' # <<< 44109 1727204225.96594: stdout chunk (state=3): >>># zipimport: zlib available <<< 44109 1727204225.96745: stdout chunk (state=3): >>># zipimport: zlib available <<< 44109 1727204225.96879: stdout chunk (state=3): >>># zipimport: zlib available <<< 44109 
1727204225.97214: stdout chunk (state=3): >>># zipimport: zlib available <<< 44109 1727204225.97549: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.generic_bsd' # import 'ansible.module_utils.facts.network.aix' # <<< 44109 1727204225.97572: stdout chunk (state=3): >>># zipimport: zlib available <<< 44109 1727204225.97633: stdout chunk (state=3): >>># zipimport: zlib available <<< 44109 1727204225.97685: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.darwin' # <<< 44109 1727204225.97697: stdout chunk (state=3): >>># zipimport: zlib available <<< 44109 1727204225.97738: stdout chunk (state=3): >>># zipimport: zlib available <<< 44109 1727204225.97755: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.dragonfly' # <<< 44109 1727204225.97904: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 44109 1727204225.97987: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.fc_wwn' # <<< 44109 1727204225.98002: stdout chunk (state=3): >>># zipimport: zlib available <<< 44109 1727204225.98036: stdout chunk (state=3): >>># zipimport: zlib available <<< 44109 1727204225.98066: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.freebsd' # # zipimport: zlib available <<< 44109 1727204225.98140: stdout chunk (state=3): >>># zipimport: zlib available <<< 44109 1727204225.98216: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.hpux' # <<< 44109 1727204225.98231: stdout chunk (state=3): >>># zipimport: zlib available <<< 44109 1727204225.98311: stdout chunk (state=3): >>># zipimport: zlib available <<< 44109 1727204225.98390: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.hurd' # <<< 44109 1727204225.98406: stdout chunk (state=3): >>># zipimport: zlib available <<< 44109 1727204225.98860: stdout chunk (state=3): >>># zipimport: zlib available <<< 44109 1727204225.99296: stdout chunk (state=3): >>>import 
'ansible.module_utils.facts.network.linux' # <<< 44109 1727204225.99315: stdout chunk (state=3): >>># zipimport: zlib available <<< 44109 1727204225.99398: stdout chunk (state=3): >>># zipimport: zlib available <<< 44109 1727204225.99702: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.iscsi' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.nvme' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.netbsd' # # zipimport: zlib available <<< 44109 1727204225.99721: stdout chunk (state=3): >>># zipimport: zlib available <<< 44109 1727204225.99766: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.openbsd' # <<< 44109 1727204225.99776: stdout chunk (state=3): >>># zipimport: zlib available <<< 44109 1727204225.99922: stdout chunk (state=3): >>># zipimport: zlib available <<< 44109 1727204226.00064: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.sunos' # # zipimport: zlib available <<< 44109 1727204226.00094: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.virtual' # # zipimport: zlib available <<< 44109 1727204226.00154: stdout chunk (state=3): >>># zipimport: zlib available <<< 44109 1727204226.00221: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.base' # <<< 44109 1727204226.00240: stdout chunk (state=3): >>># zipimport: zlib available <<< 44109 1727204226.00282: stdout chunk (state=3): >>># zipimport: zlib available <<< 44109 1727204226.00285: stdout chunk (state=3): >>># zipimport: zlib available <<< 44109 1727204226.00357: stdout chunk (state=3): >>># zipimport: zlib available <<< 44109 1727204226.00601: stdout chunk (state=3): >>># zipimport: zlib available<<< 44109 1727204226.00609: stdout chunk (state=3): >>> # zipimport: zlib available <<< 44109 1727204226.00628: stdout chunk (state=3): >>>import 
'ansible.module_utils.facts.virtual.sysctl' # <<< 44109 1727204226.00645: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.freebsd' # import 'ansible.module_utils.facts.virtual.dragonfly' # <<< 44109 1727204226.00670: stdout chunk (state=3): >>># zipimport: zlib available <<< 44109 1727204226.00727: stdout chunk (state=3): >>># zipimport: zlib available <<< 44109 1727204226.00804: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.hpux' # <<< 44109 1727204226.00810: stdout chunk (state=3): >>># zipimport: zlib available <<< 44109 1727204226.01149: stdout chunk (state=3): >>># zipimport: zlib available <<< 44109 1727204226.01470: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.linux' # <<< 44109 1727204226.01490: stdout chunk (state=3): >>># zipimport: zlib available <<< 44109 1727204226.01555: stdout chunk (state=3): >>># zipimport: zlib available <<< 44109 1727204226.01697: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.netbsd' # # zipimport: zlib available # zipimport: zlib available <<< 44109 1727204226.01760: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.openbsd' # <<< 44109 1727204226.01778: stdout chunk (state=3): >>># zipimport: zlib available <<< 44109 1727204226.01899: stdout chunk (state=3): >>># zipimport: zlib available <<< 44109 1727204226.02027: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.sunos' # import 'ansible.module_utils.facts.default_collectors' # <<< 44109 1727204226.02051: stdout chunk (state=3): >>># zipimport: zlib available <<< 44109 1727204226.02188: stdout chunk (state=3): >>># zipimport: zlib available <<< 44109 1727204226.02319: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.ansible_collector' # <<< 44109 1727204226.02325: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.compat' # import 'ansible.module_utils.facts' # <<< 44109 1727204226.02496: stdout chunk (state=3): >>># 
zipimport: zlib available <<< 44109 1727204226.03845: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc matches /usr/lib64/python3.12/encodings/idna.py <<< 44109 1727204226.03860: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc' <<< 44109 1727204226.03889: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc matches /usr/lib64/python3.12/stringprep.py <<< 44109 1727204226.03925: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc' <<< 44109 1727204226.04098: stdout chunk (state=3): >>># extension module 'unicodedata' loaded from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' # extension module 'unicodedata' executed from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f78718eb890> import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7f78718e9dc0> import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7f78718e85c0> <<< 44109 1727204226.04740: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Tuesday", "weekday_number": "2", "weeknumber": "39", "day": "24", "hour": "14", "minute": 
"57", "second": "06", "epoch": "1727204226", "epoch_int": "1727204226", "date": "2024-09-24", "time": "14:57:06", "iso8601_micro": "2024-09-24T18:57:06.036150Z", "iso8601": "2024-09-24T18:57:06Z", "iso8601_basic": "20240924T145706036150", "iso8601_basic_short": "20240924T145706", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.45.138 50362 10.31.14.47 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.45.138 50362 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_system": "Linux", "ansible_kernel": "6.11.0-25.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Mon Sep 16 20:35:26 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "managed-node1", "ansible_hostname": "managed-node1", "ansible_nodename": "managed-node1", "ansible_domain": "", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec277914f6c5b9c03bd977e30033112b", "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-25.el10.x86_64", "root": 
"UUID=973ca870-ed1b-4e56-a8b4-735608119a28", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-25.el10.x86_64", "root": "UUID=973ca870-ed1b-4e56-a8b4-735608119a28", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCWk3MCbpUJBEaXgG200pw3DBo34ukitT+wfYcFTuNXyUUPaXQe7v940X0cf5U78BgS3AAiRxfHirMb+4r43rwxBe5tl4Vq2WM+kz3JnOtxK8ZXTmwS9PbltzX5fg5CVds9Bu6KIwABJMlgT9CTHVjFlTBc8wpoeJvx8wVgMwQlnF+PFN/lzD0gn0sF11bqe2QCvxmm9r7Lum/QdIVGgOiZMMbULJZb0Iih24Tn74Ho6n9zLSFZ5FiFifjm7M6k1hVtfcAQi3GfPdUxkN0Z66f3KaW4hptFlHxttjLliuAyfNF4UrXIab7y/nDix1ll4x4lLHsVRpcwtmVVe/Z+2/pcmRv4fjw8YzWY1pLV5u1BUUBthICgfv0pXEuLP9UD/krnjy0Ho9gAbtjdoRWtn7gvjRds+WEwk83rZS3UnAc3pl2DmHzbp4IfRC1zp8eJPJoVwcSTEr61su59tkNntjdKAcLWeas1p9XBaTIRx7aqRp7Vdet96xbSnDnBCriXgAU=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBJzECuW8BnvEbYnQxnxmebvg9TYk9r0OUd9aUg8FFv4MvjSzW8tCfnW556hw9n4PI2hShtAWz7XExrMZPlTQXRo=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAINUE+mKAHEgVl/vTdVMwRCu3lDCTOYBl1RcikvxylCeg", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_fips": false, "ansible_service_mgr": "systemd", "ansible_lsb": {}, "ansible_local": {}, "ansible_pkg_mgr": "dnf", "ansible_apparmor": {"status": "disabled"}, 
"ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "gather_subset": ["min"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["min"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 44109 1727204226.05771: stdout chunk (state=3): >>># clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 <<< 44109 1727204226.05774: stdout chunk (state=3): >>># clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin <<< 44109 1727204226.05782: stdout chunk (state=3): >>># restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external <<< 44109 1727204226.05810: stdout chunk (state=3): >>># cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc <<< 44109 1727204226.05813: stdout chunk (state=3): >>># cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path<<< 44109 1727204226.05899: stdout chunk (state=3): >>> # cleanup[2] removing os # cleanup[2] removing _sitebuiltins<<< 
44109 1727204226.05974: stdout chunk (state=3): >>> # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing 
urllib # destroy urllib <<< 44109 1727204226.05980: stdout chunk (state=3): >>># cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder <<< 44109 1727204226.05985: stdout chunk (state=3): >>># cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd <<< 44109 1727204226.06104: stdout chunk (state=3): >>># destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # 
cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # 
cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec <<< 44109 1727204226.06414: stdout chunk (state=3): >>># cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 <<< 44109 1727204226.06607: stdout chunk (state=3): >>># cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _heapq # cleanup[2] removing heapq # destroy heapq # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing 
ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # 
cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing 
ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy 
ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy 
ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna <<< 44109 1727204226.07140: stdout chunk (state=3): >>># destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma <<< 44109 1727204226.07210: stdout chunk (state=3): >>># destroy _blake2 # destroy binascii # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path <<< 44109 1727204226.07262: stdout chunk (state=3): >>># destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy ipaddress <<< 44109 1727204226.07266: stdout chunk (state=3): >>># destroy ntpath <<< 44109 1727204226.07310: stdout chunk (state=3): >>># destroy importlib # destroy zipimport # destroy __main__ # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner <<< 44109 1727204226.07423: stdout chunk (state=3): >>># destroy _json # destroy grp # destroy encodings # destroy _locale # destroy locale # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy selinux # destroy shutil <<< 44109 1727204226.07435: stdout chunk (state=3): >>># destroy distro # destroy distro.distro # destroy argparse # destroy logging <<< 44109 1727204226.07522: stdout chunk (state=3): >>># destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector <<< 44109 1727204226.07546: stdout chunk (state=3): >>># destroy multiprocessing # destroy multiprocessing.connection # destroy multiprocessing.pool # destroy signal # destroy pickle # destroy multiprocessing.context # destroy array # destroy _compat_pickle # destroy _pickle <<< 44109 1727204226.07608: stdout chunk (state=3): >>># destroy queue # destroy _heapq # destroy _queue # destroy 
multiprocessing.process # destroy unicodedata # destroy tempfile # destroy multiprocessing.util # destroy multiprocessing.reduction # destroy selectors <<< 44109 1727204226.07611: stdout chunk (state=3): >>># destroy _multiprocessing <<< 44109 1727204226.07723: stdout chunk (state=3): >>># destroy shlex # destroy fcntl # destroy datetime # destroy subprocess # destroy base64 # destroy _ssl # destroy ansible.module_utils.compat.selinux # destroy getpass # destroy pwd # destroy termios # destroy errno # destroy json <<< 44109 1727204226.07766: stdout chunk (state=3): >>># destroy socket # destroy struct # destroy glob <<< 44109 1727204226.07789: stdout chunk (state=3): >>># destroy fnmatch # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector <<< 44109 1727204226.08122: stdout chunk (state=3): >>># cleanup[3] wiping encodings.idna # destroy stringprep <<< 44109 1727204226.08155: stdout chunk (state=3): >>># cleanup[3] wiping configparser # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] 
wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime <<< 44109 1727204226.08302: stdout chunk (state=3): >>># destroy sys.monitoring # destroy _socket <<< 44109 1727204226.08326: stdout chunk (state=3): >>># destroy _collections <<< 44109 1727204226.08367: stdout chunk (state=3): >>># destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser <<< 44109 1727204226.08413: stdout chunk (state=3): >>># destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy 
contextlib <<< 44109 1727204226.08461: stdout chunk (state=3): >>># destroy _typing # destroy _tokenize <<< 44109 1727204226.08514: stdout chunk (state=3): >>># destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves <<< 44109 1727204226.08518: stdout chunk (state=3): >>># destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal <<< 44109 1727204226.08767: stdout chunk (state=3): >>># clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy encodings.idna # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _hashlib <<< 44109 1727204226.08774: stdout chunk (state=3): >>># destroy _operator # destroy _sre # destroy _string # destroy re <<< 44109 1727204226.08809: stdout chunk (state=3): >>># destroy itertools # destroy _abc # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks <<< 44109 1727204226.09924: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.47 closed. 
<<< 44109 1727204226.09927: stdout chunk (state=3): >>><<< 44109 1727204226.09930: stderr chunk (state=3): >>><<< 44109 1727204226.10112: _low_level_execute_command() done: rc=0, stdout=import _frozen_importlib # frozen import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # import '_io' # import 'marshal' # import 'posix' # import '_frozen_importlib_external' # # installing zipimport hook import 'time' # import 'zipimport' # # installed zipimport hook # /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' import '_codecs' # import 'codecs' # # /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7872a184d0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f78729e7b30> # /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7872a1aa50> import '_signal' # import '_abc' # import 'abc' # import 'io' # import '_stat' # import 'stat' # import '_collections_abc' # import 'genericpath' # import 'posixpath' # import 'os' # import '_sitebuiltins' # Processing user site-packages Processing global site-packages Adding directory: '/usr/local/lib/python3.12/site-packages' Adding directory: '/usr/lib64/python3.12/site-packages' Adding directory: '/usr/lib/python3.12/site-packages' Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' # 
/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7f787282d130> # /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py # code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7f787282e060> import 'site' # Python 3.12.5 (main, Aug 23 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. # /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py # code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py # code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py # code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py # code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f787286bf50> # /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py # code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7f78728800e0> # 
/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py # code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py # code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' import 'itertools' # # /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f78728a3920> # /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py # code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f78728a3fb0> import '_collections' # import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7872883bc0> import '_functools' # import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7872881340> import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7872869100> # /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py # code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' import '_sre' # # /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py # code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py # code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7f78728c78f0> import 're._parser' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f78728c6510> # /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7f78728821e0> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7f78728c4d10> # /usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py # code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7f78728f4950> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7872868380> # /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f78728f4e00> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f78728f4cb0> # extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f78728f50a0> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7872866ea0> # /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from 
'/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py # code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f78728f5790> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f78728f5460> import 'importlib.machinery' # # /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f78728f6660> import 'importlib.util' # import 'runpy' # # /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py # code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7872910890> import 'errno' # # extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7872911fd0> # /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py # code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' import '_compression' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f7872912e70> # extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f78729134a0> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7f78729123c0> # /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py # code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' # extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7872913e60> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7872913590> import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f78728f66c0> # /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py # code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py # code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' # extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7872607d40> # /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' # extension 
module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7872634860> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f78726345c0> # extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7872634890> # /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py # code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' # extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' # extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f78726351c0> # extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7872635b80> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7872634a70> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7872605ee0> # /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py # code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' # 
/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py # code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7872636f60> import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7872635cd0> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f78728f6db0> # /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py # code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py # code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7f787265b2c0> # /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py # code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f787267f620> # /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py # code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' import 'ntpath' # # /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f78726e03b0> # 
/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py # code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7f78726e2ae0> import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f78726e04a0> import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f78726a9430> # /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f78724ed460> import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7f787267e420> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7872637ec0> # code object from '/usr/lib64/python3.12/encodings/cp437.pyc' import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7f787267e540> # zipimport: found 103 names in '/tmp/ansible_setup_payload_zle6jp38/ansible_setup_payload.zip' # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py # code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py # code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from 
'/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7872557110> import '_typing' # import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7872536000> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7872535160> # zipimport: zlib available import 'ansible' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7872554fe0> # /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' # extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7872586a50> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f78725867e0> import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f78725860f0> # 
/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7872586870> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7872557da0> import 'atexit' # # extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f78725877d0> # extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7872587a10> # /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py # code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' import '_locale' # import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7872587f50> import 'pwd' # # /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py # code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7871f31d90> # extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7871f339b0> # /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches 
/usr/lib64/python3.12/selectors.py # code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7871f34380> # /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py # code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7871f35280> # /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py # code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7871f37f50> # extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7872912de0> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7871f36210> # /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py # code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py # code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' # 
/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7871f3fe30> import '_tokenize' # import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7871f3e900> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7871f3e660> # /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py # code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7871f3ebd0> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7871f36720> # extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7871f83ef0> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7871f84200> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' # extension module '_datetime' loaded from 
'/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7871f85ca0> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7871f85a60> # /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py # code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' # extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7871f881d0> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7871f86360> # /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py # code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py # code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' import '_string' # import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7871f8b980> import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7871f88380> # extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7871f8c7a0> # extension module 'systemd._reader' loaded from 
'/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7871f8cbc0> # extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7871f8cb00> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7871f843b0> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py # code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' # extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7871e18170> # extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7871e194f0> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7871f8e900> # extension module 'systemd._daemon' loaded from 
'/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7871f8fcb0> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7871f8e510> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.six' # import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # # /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' # extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7871e1d850> # /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7871e1ebd0> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7871e19700> import 
'ansible.module_utils.compat.selinux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils._text' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py # code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7871e1ecf0> # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.collections' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.warnings' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.errors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing.convert_bool' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py # code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' import '_ast' # import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7871e1f920> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text.formatters' # import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # import 'ansible.module_utils.common.arg_spec' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.locale' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py # code object from 
'/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' # extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7871e2a150> import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7871e25b20> import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' # /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py # code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py # code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7871f02c00> import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7871ffe900> import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7871e2a420> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7871e1c860> # destroy ansible.module_utils.distro 
import 'ansible.module_utils.distro' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # import 'ansible.module_utils.basic' # # zipimport: zlib available # zipimport: zlib available import 'ansible.modules' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.namespace' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.typing' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/__init__.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/context.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/process.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc' import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7871eba570> # /usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/reduction.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc matches /usr/lib64/python3.12/pickle.py # code object from 
'/usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc matches /usr/lib64/python3.12/_compat_pickle.py # code object from '/usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc' import '_compat_pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7871a783b0> # extension module '_pickle' loaded from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' # extension module '_pickle' executed from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' import '_pickle' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7871a787a0> import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7871ea49e0> import 'multiprocessing.reduction' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7871ebb110> import 'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7871eb8c50> import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7871eb88c0> # /usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/pool.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc matches /usr/lib64/python3.12/queue.py # code object from '/usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc matches /usr/lib64/python3.12/heapq.py # code object from '/usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc' # extension module '_heapq' loaded from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' # extension module '_heapq' executed from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7871a7b740> import 'heapq' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f7871a7aff0> # extension module '_queue' loaded from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' # extension module '_queue' executed from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7871a7b1d0> import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7871a7a420> # /usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/util.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc' import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7871a7b8f0> # /usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/connection.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc' # extension module '_multiprocessing' loaded from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' # extension module '_multiprocessing' executed from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7871ada3f0> import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7871ad8440> import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7871eb8980> import 'ansible.module_utils.facts.timeout' # import 'ansible.module_utils.facts.collector' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other.facter' # # zipimport: zlib available # zipimport: zlib available import 
'ansible.module_utils.facts.other.ohai' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.apparmor' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.caps' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.chroot' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.utils' # import 'ansible.module_utils.facts.system.cmdline' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.distribution' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.datetime' # import 'ansible.module_utils.facts.system.date_time' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.env' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.dns' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.fips' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.loadavg' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/glob.py # code object from '/usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc' import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7871ada810> # /usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc matches /usr/lib64/python3.12/configparser.py # code object from '/usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc' import 'configparser' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f7871adb3e0> import 'ansible.module_utils.facts.system.local' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.lsb' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.pkg_mgr' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.platform' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc matches /usr/lib64/python3.12/ssl.py # code object from '/usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc' # extension module '_ssl' loaded from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' # extension module '_ssl' executed from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7871b166f0> import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7871adbf20> import 'ansible.module_utils.facts.system.python' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.selinux' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.version' # import 'ansible.module_utils.facts.system.service_mgr' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.ssh_pub_keys' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc matches /usr/lib64/python3.12/getpass.py # code object from '/usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc' # extension module 'termios' loaded from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' # extension module 'termios' executed from 
'/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7871b2a270> import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7871b29e80> import 'ansible.module_utils.facts.system.user' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.base' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.aix' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.sysctl' # import 'ansible.module_utils.facts.hardware.darwin' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.freebsd' # import 'ansible.module_utils.facts.hardware.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.hpux' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.linux' # import 'ansible.module_utils.facts.hardware.hurd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.sunos' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.base' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib 
available # zipimport: zlib available import 'ansible.module_utils.facts.network.generic_bsd' # import 'ansible.module_utils.facts.network.aix' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.darwin' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.fc_wwn' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.freebsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.hpux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.hurd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.linux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.iscsi' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.nvme' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.sunos' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.base' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.sysctl' # import 'ansible.module_utils.facts.virtual.freebsd' # import 'ansible.module_utils.facts.virtual.dragonfly' # # 
zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.hpux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.linux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.sunos' # import 'ansible.module_utils.facts.default_collectors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.ansible_collector' # import 'ansible.module_utils.facts.compat' # import 'ansible.module_utils.facts' # # zipimport: zlib available # /usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc matches /usr/lib64/python3.12/encodings/idna.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc matches /usr/lib64/python3.12/stringprep.py # code object from '/usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc' # extension module 'unicodedata' loaded from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' # extension module 'unicodedata' executed from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f78718eb890> import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7f78718e9dc0> import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7f78718e85c0> {"ansible_facts": {"ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": 
"10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Tuesday", "weekday_number": "2", "weeknumber": "39", "day": "24", "hour": "14", "minute": "57", "second": "06", "epoch": "1727204226", "epoch_int": "1727204226", "date": "2024-09-24", "time": "14:57:06", "iso8601_micro": "2024-09-24T18:57:06.036150Z", "iso8601": "2024-09-24T18:57:06Z", "iso8601_basic": "20240924T145706036150", "iso8601_basic_short": "20240924T145706", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.45.138 50362 10.31.14.47 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.45.138 50362 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_system": "Linux", "ansible_kernel": "6.11.0-25.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Mon Sep 16 20:35:26 UTC 
2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "managed-node1", "ansible_hostname": "managed-node1", "ansible_nodename": "managed-node1", "ansible_domain": "", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec277914f6c5b9c03bd977e30033112b", "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-25.el10.x86_64", "root": "UUID=973ca870-ed1b-4e56-a8b4-735608119a28", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-25.el10.x86_64", "root": "UUID=973ca870-ed1b-4e56-a8b4-735608119a28", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCWk3MCbpUJBEaXgG200pw3DBo34ukitT+wfYcFTuNXyUUPaXQe7v940X0cf5U78BgS3AAiRxfHirMb+4r43rwxBe5tl4Vq2WM+kz3JnOtxK8ZXTmwS9PbltzX5fg5CVds9Bu6KIwABJMlgT9CTHVjFlTBc8wpoeJvx8wVgMwQlnF+PFN/lzD0gn0sF11bqe2QCvxmm9r7Lum/QdIVGgOiZMMbULJZb0Iih24Tn74Ho6n9zLSFZ5FiFifjm7M6k1hVtfcAQi3GfPdUxkN0Z66f3KaW4hptFlHxttjLliuAyfNF4UrXIab7y/nDix1ll4x4lLHsVRpcwtmVVe/Z+2/pcmRv4fjw8YzWY1pLV5u1BUUBthICgfv0pXEuLP9UD/krnjy0Ho9gAbtjdoRWtn7gvjRds+WEwk83rZS3UnAc3pl2DmHzbp4IfRC1zp8eJPJoVwcSTEr61su59tkNntjdKAcLWeas1p9XBaTIRx7aqRp7Vdet96xbSnDnBCriXgAU=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBJzECuW8BnvEbYnQxnxmebvg9TYk9r0OUd9aUg8FFv4MvjSzW8tCfnW556hw9n4PI2hShtAWz7XExrMZPlTQXRo=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAINUE+mKAHEgVl/vTdVMwRCu3lDCTOYBl1RcikvxylCeg", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_user_id": "root", 
"ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_fips": false, "ansible_service_mgr": "systemd", "ansible_lsb": {}, "ansible_local": {}, "ansible_pkg_mgr": "dnf", "ansible_apparmor": {"status": "disabled"}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "gather_subset": ["min"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["min"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} # clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing 
posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # 
cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing 
systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # 
cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _heapq # cleanup[2] removing heapq # destroy heapq # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing 
ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # 
cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing 
ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy 
ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # destroy _sitebuiltins # destroy importlib.machinery # 
destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy _blake2 # destroy binascii # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy locale # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy logging # destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.connection # destroy multiprocessing.pool # destroy signal # destroy pickle # destroy multiprocessing.context # destroy array # destroy _compat_pickle # destroy _pickle # destroy queue # destroy _heapq # destroy _queue # destroy multiprocessing.process # destroy unicodedata # destroy tempfile # destroy multiprocessing.util # destroy multiprocessing.reduction # destroy selectors # destroy _multiprocessing # destroy shlex # destroy fcntl # destroy datetime # destroy subprocess # destroy base64 # destroy _ssl # destroy ansible.module_utils.compat.selinux # destroy getpass # destroy pwd # destroy termios # destroy errno # destroy json # destroy socket # destroy struct # destroy glob # destroy fnmatch # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping configparser # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # 
cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] 
wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy encodings.idna # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _hashlib # destroy _operator # destroy _sre # destroy _string # destroy re # destroy itertools # destroy _abc # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.47 closed. 
[WARNING]: Module invocation had junk after the JSON data: 44109 1727204226.11466: done with _execute_module (setup, {'gather_subset': 'min', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204225.394872-44343-59979469677423/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 44109 1727204226.11469: _low_level_execute_command(): starting 44109 1727204226.11471: _low_level_execute_command():
executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204225.394872-44343-59979469677423/ > /dev/null 2>&1 && sleep 0' 44109 1727204226.11869: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44109 1727204226.11900: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 44109 1727204226.11913: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44109 1727204226.11928: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44109 1727204226.12046: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44109 1727204226.15109: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44109 1727204226.15112: stdout chunk (state=3): >>><<< 44109 1727204226.15115: stderr chunk (state=3): >>><<< 44109 1727204226.15136: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration 
data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44109 1727204226.15146: handler run complete 44109 1727204226.15195: variable 'ansible_facts' from source: unknown 44109 1727204226.15255: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44109 1727204226.15381: variable 'ansible_facts' from source: unknown 44109 1727204226.15430: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44109 1727204226.15495: attempt loop complete, returning result 44109 1727204226.15581: _execute() done 44109 1727204226.15586: dumping result to json 44109 1727204226.15588: done dumping result, returning 44109 1727204226.15590: done running TaskExecutor() for managed-node1/TASK: Gather the minimum subset of ansible_facts required by the network role test [028d2410-947f-ed67-a560-0000000000c0] 44109 1727204226.15593: sending task result for task 028d2410-947f-ed67-a560-0000000000c0 ok: [managed-node1] 44109 1727204226.16022: no more pending results, 
returning what we have 44109 1727204226.16025: results queue empty 44109 1727204226.16026: checking for any_errors_fatal 44109 1727204226.16027: done checking for any_errors_fatal 44109 1727204226.16028: checking for max_fail_percentage 44109 1727204226.16030: done checking for max_fail_percentage 44109 1727204226.16030: checking to see if all hosts have failed and the running result is not ok 44109 1727204226.16031: done checking to see if all hosts have failed 44109 1727204226.16032: getting the remaining hosts for this loop 44109 1727204226.16033: done getting the remaining hosts for this loop 44109 1727204226.16037: getting the next task for host managed-node1 44109 1727204226.16045: done getting next task for host managed-node1 44109 1727204226.16047: ^ task is: TASK: Check if system is ostree 44109 1727204226.16050: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44109 1727204226.16053: getting variables 44109 1727204226.16054: in VariableManager get_vars() 44109 1727204226.16087: Calling all_inventory to load vars for managed-node1 44109 1727204226.16090: Calling groups_inventory to load vars for managed-node1 44109 1727204226.16094: Calling all_plugins_inventory to load vars for managed-node1 44109 1727204226.16100: done sending task result for task 028d2410-947f-ed67-a560-0000000000c0 44109 1727204226.16102: WORKER PROCESS EXITING 44109 1727204226.16111: Calling all_plugins_play to load vars for managed-node1 44109 1727204226.16115: Calling groups_plugins_inventory to load vars for managed-node1 44109 1727204226.16118: Calling groups_plugins_play to load vars for managed-node1 44109 1727204226.16448: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44109 1727204226.16647: done with get_vars() 44109 1727204226.16656: done getting variables TASK [Check if system is ostree] *********************************************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:17 Tuesday 24 September 2024 14:57:06 -0400 (0:00:00.884) 0:00:02.963 ***** 44109 1727204226.16749: entering _queue_task() for managed-node1/stat 44109 1727204226.17086: worker is 1 (out of 1 available) 44109 1727204226.17096: exiting _queue_task() for managed-node1/stat 44109 1727204226.17107: done queuing things up, now waiting for results queue to drain 44109 1727204226.17108: waiting for pending results... 
44109 1727204226.17274: running TaskExecutor() for managed-node1/TASK: Check if system is ostree 44109 1727204226.17397: in run() - task 028d2410-947f-ed67-a560-0000000000c2 44109 1727204226.17416: variable 'ansible_search_path' from source: unknown 44109 1727204226.17502: variable 'ansible_search_path' from source: unknown 44109 1727204226.17507: calling self._execute() 44109 1727204226.17542: variable 'ansible_host' from source: host vars for 'managed-node1' 44109 1727204226.17556: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44109 1727204226.17572: variable 'omit' from source: magic vars 44109 1727204226.18089: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 44109 1727204226.18349: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 44109 1727204226.18409: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 44109 1727204226.18447: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 44109 1727204226.18592: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 44109 1727204226.18603: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 44109 1727204226.18632: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 44109 1727204226.18661: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 44109 1727204226.18701: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 44109 1727204226.18830: Evaluated conditional (not __network_is_ostree is defined): True 44109 1727204226.18840: variable 'omit' from source: magic vars 44109 1727204226.18883: variable 'omit' from source: magic vars 44109 1727204226.18928: variable 'omit' from source: magic vars 44109 1727204226.18955: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 44109 1727204226.18992: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 44109 1727204226.19016: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 44109 1727204226.19047: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44109 1727204226.19062: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44109 1727204226.19098: variable 'inventory_hostname' from source: host vars for 'managed-node1' 44109 1727204226.19108: variable 'ansible_host' from source: host vars for 'managed-node1' 44109 1727204226.19117: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44109 1727204226.19224: Set connection var ansible_connection to ssh 44109 1727204226.19244: Set connection var ansible_timeout to 10 44109 1727204226.19352: Set connection var ansible_module_compression to ZIP_DEFLATED 44109 1727204226.19355: Set connection var ansible_pipelining to False 44109 1727204226.19358: Set connection var ansible_shell_executable to /bin/sh 44109 1727204226.19360: Set connection var ansible_shell_type to sh 44109 1727204226.19362: variable 'ansible_shell_executable' from source: unknown 44109 1727204226.19363: variable 'ansible_connection' from 
source: unknown 44109 1727204226.19366: variable 'ansible_module_compression' from source: unknown 44109 1727204226.19368: variable 'ansible_shell_type' from source: unknown 44109 1727204226.19369: variable 'ansible_shell_executable' from source: unknown 44109 1727204226.19371: variable 'ansible_host' from source: host vars for 'managed-node1' 44109 1727204226.19373: variable 'ansible_pipelining' from source: unknown 44109 1727204226.19378: variable 'ansible_timeout' from source: unknown 44109 1727204226.19380: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44109 1727204226.19520: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 44109 1727204226.19533: variable 'omit' from source: magic vars 44109 1727204226.19541: starting attempt loop 44109 1727204226.19546: running the handler 44109 1727204226.19582: _low_level_execute_command(): starting 44109 1727204226.19607: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 44109 1727204226.20464: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 
'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44109 1727204226.20493: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 44109 1727204226.20513: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44109 1727204226.20538: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44109 1727204226.20669: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44109 1727204226.23583: stdout chunk (state=3): >>>/root <<< 44109 1727204226.23587: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44109 1727204226.23589: stdout chunk (state=3): >>><<< 44109 1727204226.23591: stderr chunk (state=3): >>><<< 44109 1727204226.23595: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: 
master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44109 1727204226.23604: _low_level_execute_command(): starting 44109 1727204226.23607: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204226.234983-44392-189388378635265 `" && echo ansible-tmp-1727204226.234983-44392-189388378635265="` echo /root/.ansible/tmp/ansible-tmp-1727204226.234983-44392-189388378635265 `" ) && sleep 0' 44109 1727204226.24682: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44109 1727204226.24883: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK <<< 44109 1727204226.25115: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44109 1727204226.25209: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44109 1727204226.28320: stdout chunk (state=3): 
>>>ansible-tmp-1727204226.234983-44392-189388378635265=/root/.ansible/tmp/ansible-tmp-1727204226.234983-44392-189388378635265 <<< 44109 1727204226.28885: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44109 1727204226.28889: stdout chunk (state=3): >>><<< 44109 1727204226.28891: stderr chunk (state=3): >>><<< 44109 1727204226.28894: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204226.234983-44392-189388378635265=/root/.ansible/tmp/ansible-tmp-1727204226.234983-44392-189388378635265 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44109 1727204226.28896: variable 'ansible_module_compression' from source: unknown 44109 1727204226.28898: ANSIBALLZ: Using lock for stat 44109 1727204226.28900: ANSIBALLZ: Acquiring lock 44109 1727204226.28902: ANSIBALLZ: Lock acquired: 139907468546976 44109 1727204226.28904: ANSIBALLZ: 
Creating module 44109 1727204226.41813: ANSIBALLZ: Writing module into payload 44109 1727204226.41969: ANSIBALLZ: Writing module 44109 1727204226.41999: ANSIBALLZ: Renaming module 44109 1727204226.42015: ANSIBALLZ: Done creating module 44109 1727204226.42039: variable 'ansible_facts' from source: unknown 44109 1727204226.42163: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204226.234983-44392-189388378635265/AnsiballZ_stat.py 44109 1727204226.42288: Sending initial data 44109 1727204226.42392: Sent initial data (152 bytes) 44109 1727204226.43043: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44109 1727204226.43046: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44109 1727204226.43056: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44109 1727204226.43140: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44109 1727204226.43166: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 <<< 44109 1727204226.43174: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 44109 1727204226.43183: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44109 1727204226.43197: stderr chunk (state=3): 
>>>debug2: mux_client_hello_exchange: master version 4 <<< 44109 1727204226.43317: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44109 1727204226.45789: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" <<< 44109 1727204226.45792: stderr chunk (state=3): >>>debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 44109 1727204226.45864: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 44109 1727204226.45944: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-44109pzfqangk/tmpgxogso8k /root/.ansible/tmp/ansible-tmp-1727204226.234983-44392-189388378635265/AnsiballZ_stat.py <<< 44109 1727204226.45948: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204226.234983-44392-189388378635265/AnsiballZ_stat.py" <<< 44109 1727204226.46028: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-44109pzfqangk/tmpgxogso8k" to remote "/root/.ansible/tmp/ansible-tmp-1727204226.234983-44392-189388378635265/AnsiballZ_stat.py" <<< 44109 1727204226.46032: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204226.234983-44392-189388378635265/AnsiballZ_stat.py" <<< 44109 1727204226.46719: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44109 1727204226.46762: stderr chunk (state=3): >>><<< 44109 1727204226.46765: stdout chunk (state=3): >>><<< 44109 1727204226.46785: done transferring module to remote 44109 1727204226.46798: _low_level_execute_command(): starting 44109 1727204226.46802: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204226.234983-44392-189388378635265/ /root/.ansible/tmp/ansible-tmp-1727204226.234983-44392-189388378635265/AnsiballZ_stat.py && sleep 0' 44109 1727204226.47238: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44109 1727204226.47241: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found <<< 44109 1727204226.47243: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 44109 1727204226.47246: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 44109 1727204226.47247: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found <<< 44109 1727204226.47249: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44109 1727204226.47299: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK <<< 44109 1727204226.47302: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44109 1727204226.47388: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44109 1727204226.50144: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44109 1727204226.50167: stderr chunk (state=3): >>><<< 44109 1727204226.50170: stdout chunk (state=3): >>><<< 44109 1727204226.50187: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: 
Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44109 1727204226.50190: _low_level_execute_command(): starting 44109 1727204226.50194: _low_level_execute_command(): executing: /bin/sh -c 'PYTHONVERBOSE=1 /usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204226.234983-44392-189388378635265/AnsiballZ_stat.py && sleep 0' 44109 1727204226.50628: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 44109 1727204226.50633: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found <<< 44109 1727204226.50636: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44109 1727204226.50639: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44109 1727204226.50687: stderr chunk 
(state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 44109 1727204226.50690: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 44109 1727204226.50783: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44109 1727204226.54125: stdout chunk (state=3): >>>import _frozen_importlib # frozen <<< 44109 1727204226.54187: stdout chunk (state=3): >>>import _imp # builtin <<< 44109 1727204226.54230: stdout chunk (state=3): >>>import '_thread' # <<< 44109 1727204226.54260: stdout chunk (state=3): >>>import '_warnings' # <<< 44109 1727204226.54268: stdout chunk (state=3): >>>import '_weakref' # <<< 44109 1727204226.54378: stdout chunk (state=3): >>>import '_io' # <<< 44109 1727204226.54400: stdout chunk (state=3): >>>import 'marshal' # <<< 44109 1727204226.54465: stdout chunk (state=3): >>> import 'posix' # <<< 44109 1727204226.54471: stdout chunk (state=3): >>> <<< 44109 1727204226.54522: stdout chunk (state=3): >>>import '_frozen_importlib_external' # <<< 44109 1727204226.54538: stdout chunk (state=3): >>> # installing zipimport hook <<< 44109 1727204226.54570: stdout chunk (state=3): >>>import 'time' # <<< 44109 1727204226.54590: stdout chunk (state=3): >>> import 'zipimport' # <<< 44109 1727204226.54605: stdout chunk (state=3): >>> <<< 44109 1727204226.54685: stdout chunk (state=3): >>># installed zipimport hook # /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py<<< 44109 1727204226.54691: stdout chunk (state=3): >>> <<< 44109 1727204226.54713: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' <<< 44109 1727204226.54751: stdout chunk (state=3): >>>import '_codecs' # <<< 44109 1727204226.54756: stdout chunk (state=3): >>> <<< 44109 1727204226.54795: stdout chunk (state=3): >>>import 'codecs' # <<< 
44109 1727204226.54861: stdout chunk (state=3): >>> # /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py<<< 44109 1727204226.54864: stdout chunk (state=3): >>> <<< 44109 1727204226.54909: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc'<<< 44109 1727204226.54914: stdout chunk (state=3): >>> <<< 44109 1727204226.54939: stdout chunk (state=3): >>>import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff353c184d0><<< 44109 1727204226.54954: stdout chunk (state=3): >>> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff353be7b30><<< 44109 1727204226.54988: stdout chunk (state=3): >>> # /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py<<< 44109 1727204226.55003: stdout chunk (state=3): >>> <<< 44109 1727204226.55009: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc'<<< 44109 1727204226.55028: stdout chunk (state=3): >>> import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff353c1aa50><<< 44109 1727204226.55068: stdout chunk (state=3): >>> import '_signal' # <<< 44109 1727204226.55110: stdout chunk (state=3): >>>import '_abc' # <<< 44109 1727204226.55116: stdout chunk (state=3): >>> <<< 44109 1727204226.55166: stdout chunk (state=3): >>>import 'abc' # import 'io' # <<< 44109 1727204226.55168: stdout chunk (state=3): >>> <<< 44109 1727204226.55223: stdout chunk (state=3): >>>import '_stat' # import 'stat' # <<< 44109 1727204226.55291: stdout chunk (state=3): >>> <<< 44109 1727204226.55380: stdout chunk (state=3): >>>import '_collections_abc' # <<< 44109 1727204226.55384: stdout chunk (state=3): >>> <<< 44109 1727204226.55429: stdout chunk (state=3): >>>import 'genericpath' # <<< 44109 1727204226.55450: stdout chunk 
(state=3): >>>import 'posixpath' # <<< 44109 1727204226.55453: stdout chunk (state=3): >>> <<< 44109 1727204226.55504: stdout chunk (state=3): >>>import 'os' # <<< 44109 1727204226.55532: stdout chunk (state=3): >>> import '_sitebuiltins' # <<< 44109 1727204226.55536: stdout chunk (state=3): >>> <<< 44109 1727204226.55571: stdout chunk (state=3): >>>Processing user site-packages Processing global site-packages<<< 44109 1727204226.55602: stdout chunk (state=3): >>> Adding directory: '/usr/local/lib/python3.12/site-packages'<<< 44109 1727204226.55616: stdout chunk (state=3): >>> Adding directory: '/usr/lib64/python3.12/site-packages'<<< 44109 1727204226.55632: stdout chunk (state=3): >>> Adding directory: '/usr/lib/python3.12/site-packages'<<< 44109 1727204226.55650: stdout chunk (state=3): >>> Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth'<<< 44109 1727204226.55704: stdout chunk (state=3): >>> # /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py <<< 44109 1727204226.55729: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' <<< 44109 1727204226.55796: stdout chunk (state=3): >>>import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff353a09130> <<< 44109 1727204226.55863: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py<<< 44109 1727204226.55891: stdout chunk (state=3): >>> # code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' <<< 44109 1727204226.55955: stdout chunk (state=3): >>>import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff353a0a060> import 'site' # <<< 44109 1727204226.56003: stdout chunk (state=3): >>>Python 3.12.5 (main, Aug 23 
2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] on linux<<< 44109 1727204226.56008: stdout chunk (state=3): >>> <<< 44109 1727204226.56096: stdout chunk (state=3): >>>Type "help", "copyright", "credits" or "license" for more information. <<< 44109 1727204226.56409: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py<<< 44109 1727204226.56414: stdout chunk (state=3): >>> <<< 44109 1727204226.56443: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc'<<< 44109 1727204226.56447: stdout chunk (state=3): >>> <<< 44109 1727204226.56483: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py<<< 44109 1727204226.56487: stdout chunk (state=3): >>> <<< 44109 1727204226.56513: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc'<<< 44109 1727204226.56516: stdout chunk (state=3): >>> <<< 44109 1727204226.56541: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py<<< 44109 1727204226.56620: stdout chunk (state=3): >>> # code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc'<<< 44109 1727204226.56624: stdout chunk (state=3): >>> <<< 44109 1727204226.56658: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py <<< 44109 1727204226.56712: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc'<<< 44109 1727204226.56715: stdout chunk (state=3): >>> <<< 44109 1727204226.56805: stdout chunk (state=3): >>>import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff353a47f20> # /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py # code object from 
'/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' <<< 44109 1727204226.56830: stdout chunk (state=3): >>>import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff353a5c0b0><<< 44109 1727204226.56869: stdout chunk (state=3): >>> # /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py <<< 44109 1727204226.56953: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py<<< 44109 1727204226.56956: stdout chunk (state=3): >>> <<< 44109 1727204226.57045: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' <<< 44109 1727204226.57122: stdout chunk (state=3): >>>import 'itertools' # # /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py <<< 44109 1727204226.57131: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc'<<< 44109 1727204226.57136: stdout chunk (state=3): >>> import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff353a7f950><<< 44109 1727204226.57171: stdout chunk (state=3): >>> # /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py <<< 44109 1727204226.57204: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc'<<< 44109 1727204226.57220: stdout chunk (state=3): >>> import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff353a7ffe0><<< 44109 1727204226.57243: stdout chunk (state=3): >>> import '_collections' # <<< 44109 1727204226.57291: stdout chunk (state=3): >>> <<< 44109 1727204226.57337: stdout chunk (state=3): >>>import 'collections' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7ff353a5fbf0> <<< 44109 1727204226.57366: stdout chunk (state=3): >>>import '_functools' # <<< 44109 1727204226.57431: stdout chunk (state=3): >>>import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff353a5d310><<< 44109 1727204226.57579: stdout chunk (state=3): >>> import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff353a450d0><<< 44109 1727204226.57586: stdout chunk (state=3): >>> <<< 44109 1727204226.57632: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py<<< 44109 1727204226.57643: stdout chunk (state=3): >>> <<< 44109 1727204226.57672: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc'<<< 44109 1727204226.57678: stdout chunk (state=3): >>> <<< 44109 1727204226.57706: stdout chunk (state=3): >>>import '_sre' # <<< 44109 1727204226.57713: stdout chunk (state=3): >>> <<< 44109 1727204226.57753: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py<<< 44109 1727204226.57767: stdout chunk (state=3): >>> <<< 44109 1727204226.57810: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc'<<< 44109 1727204226.57815: stdout chunk (state=3): >>> <<< 44109 1727204226.57850: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py<<< 44109 1727204226.57857: stdout chunk (state=3): >>> <<< 44109 1727204226.57881: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc'<<< 44109 1727204226.57885: stdout chunk (state=3): >>> <<< 44109 1727204226.57934: stdout chunk (state=3): >>>import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 
0x7ff353aa3890><<< 44109 1727204226.57974: stdout chunk (state=3): >>> import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff353aa24b0> <<< 44109 1727204226.58022: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' <<< 44109 1727204226.58093: stdout chunk (state=3): >>>import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff353a5e1e0> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff353aa0ce0> <<< 44109 1727204226.58120: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py<<< 44109 1727204226.58128: stdout chunk (state=3): >>> <<< 44109 1727204226.58148: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' <<< 44109 1727204226.58153: stdout chunk (state=3): >>>import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff353ad08f0><<< 44109 1727204226.58178: stdout chunk (state=3): >>> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff353a44350><<< 44109 1727204226.58215: stdout chunk (state=3): >>> # /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py<<< 44109 1727204226.58221: stdout chunk (state=3): >>> # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc'<<< 44109 1727204226.58266: stdout chunk (state=3): >>> # extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so'<<< 44109 1727204226.58271: stdout chunk (state=3): >>> <<< 44109 1727204226.58296: stdout chunk (state=3): >>># extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so'<<< 44109 1727204226.58299: 
stdout chunk (state=3): >>> import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff353ad0da0><<< 44109 1727204226.58354: stdout chunk (state=3): >>> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff353ad0c50> # extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' <<< 44109 1727204226.58384: stdout chunk (state=3): >>># extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' <<< 44109 1727204226.58389: stdout chunk (state=3): >>>import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff353ad1010> <<< 44109 1727204226.58414: stdout chunk (state=3): >>>import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff353a42e70><<< 44109 1727204226.58453: stdout chunk (state=3): >>> # /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py<<< 44109 1727204226.58461: stdout chunk (state=3): >>> <<< 44109 1727204226.58479: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc'<<< 44109 1727204226.58484: stdout chunk (state=3): >>> <<< 44109 1727204226.58516: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py<<< 44109 1727204226.58562: stdout chunk (state=3): >>> # code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' <<< 44109 1727204226.58591: stdout chunk (state=3): >>>import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff353ad16a0> <<< 44109 1727204226.58605: stdout chunk (state=3): >>>import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff353ad13a0><<< 44109 1727204226.58630: stdout chunk (state=3): >>> import 'importlib.machinery' # <<< 44109 1727204226.58670: stdout chunk 
(state=3): >>># /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc'<<< 44109 1727204226.58710: stdout chunk (state=3): >>> import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff353ad25a0> <<< 44109 1727204226.58740: stdout chunk (state=3): >>>import 'importlib.util' # <<< 44109 1727204226.58749: stdout chunk (state=3): >>> import 'runpy' # <<< 44109 1727204226.58790: stdout chunk (state=3): >>> # /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py<<< 44109 1727204226.58794: stdout chunk (state=3): >>> <<< 44109 1727204226.58856: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc'<<< 44109 1727204226.58888: stdout chunk (state=3): >>> # /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py<<< 44109 1727204226.58902: stdout chunk (state=3): >>> <<< 44109 1727204226.58916: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc'<<< 44109 1727204226.58932: stdout chunk (state=3): >>> import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff353aec7d0><<< 44109 1727204226.58953: stdout chunk (state=3): >>> import 'errno' # <<< 44109 1727204226.58996: stdout chunk (state=3): >>> # extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so'<<< 44109 1727204226.58999: stdout chunk (state=3): >>> <<< 44109 1727204226.59026: stdout chunk (state=3): >>># extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so'<<< 44109 1727204226.59029: stdout chunk (state=3): >>> import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff353aedf10><<< 44109 1727204226.59053: stdout 
chunk (state=3): >>> <<< 44109 1727204226.59084: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py<<< 44109 1727204226.59106: stdout chunk (state=3): >>> # code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc'<<< 44109 1727204226.59139: stdout chunk (state=3): >>> # /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py<<< 44109 1727204226.59147: stdout chunk (state=3): >>> <<< 44109 1727204226.59167: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc'<<< 44109 1727204226.59184: stdout chunk (state=3): >>> import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff353aeedb0><<< 44109 1727204226.59236: stdout chunk (state=3): >>> # extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' <<< 44109 1727204226.59258: stdout chunk (state=3): >>># extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' <<< 44109 1727204226.59274: stdout chunk (state=3): >>>import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff353aef410><<< 44109 1727204226.59301: stdout chunk (state=3): >>> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff353aee300><<< 44109 1727204226.59306: stdout chunk (state=3): >>> <<< 44109 1727204226.59354: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py # code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc'<<< 44109 1727204226.59360: stdout chunk (state=3): >>> <<< 44109 1727204226.59426: stdout chunk (state=3): >>># extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' # extension module '_lzma' executed from 
'/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so'<<< 44109 1727204226.59451: stdout chunk (state=3): >>> import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff353aefe90> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff353aef5c0> <<< 44109 1727204226.59553: stdout chunk (state=3): >>>import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff353ad2600> # /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py<<< 44109 1727204226.59557: stdout chunk (state=3): >>> <<< 44109 1727204226.59630: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py <<< 44109 1727204226.59670: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc'<<< 44109 1727204226.59739: stdout chunk (state=3): >>> # extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so'<<< 44109 1727204226.59742: stdout chunk (state=3): >>> <<< 44109 1727204226.59745: stdout chunk (state=3): >>># extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so'<<< 44109 1727204226.59747: stdout chunk (state=3): >>> <<< 44109 1727204226.59758: stdout chunk (state=3): >>>import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff35387bda0><<< 44109 1727204226.59786: stdout chunk (state=3): >>> # /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py<<< 44109 1727204226.59791: stdout chunk (state=3): >>> <<< 44109 1727204226.59840: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' # extension module '_bisect' loaded from 
'/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so'<<< 44109 1727204226.59843: stdout chunk (state=3): >>> <<< 44109 1727204226.59859: stdout chunk (state=3): >>># extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff3538a48c0><<< 44109 1727204226.59908: stdout chunk (state=3): >>> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff3538a4620> # extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so'<<< 44109 1727204226.59921: stdout chunk (state=3): >>> <<< 44109 1727204226.59932: stdout chunk (state=3): >>># extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so'<<< 44109 1727204226.59979: stdout chunk (state=3): >>> import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff3538a48f0> # /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py<<< 44109 1727204226.59987: stdout chunk (state=3): >>> <<< 44109 1727204226.60006: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc'<<< 44109 1727204226.60011: stdout chunk (state=3): >>> <<< 44109 1727204226.60124: stdout chunk (state=3): >>># extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so'<<< 44109 1727204226.60197: stdout chunk (state=3): >>> <<< 44109 1727204226.60340: stdout chunk (state=3): >>># extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so'<<< 44109 1727204226.60346: stdout chunk (state=3): >>> <<< 44109 1727204226.60499: stdout chunk (state=3): >>>import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff3538a5220> <<< 44109 
1727204226.60585: stdout chunk (state=3): >>># extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' <<< 44109 1727204226.60607: stdout chunk (state=3): >>># extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so'<<< 44109 1727204226.60624: stdout chunk (state=3): >>> import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff3538a5c10> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff3538a4ad0><<< 44109 1727204226.60658: stdout chunk (state=3): >>> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff353879f40><<< 44109 1727204226.60663: stdout chunk (state=3): >>> <<< 44109 1727204226.60700: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py<<< 44109 1727204226.60732: stdout chunk (state=3): >>> # code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc'<<< 44109 1727204226.60737: stdout chunk (state=3): >>> <<< 44109 1727204226.60787: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py<<< 44109 1727204226.60824: stdout chunk (state=3): >>> # code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff3538a7020><<< 44109 1727204226.60858: stdout chunk (state=3): >>> import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff3538a5d60><<< 44109 1727204226.60864: stdout chunk (state=3): >>> <<< 44109 1727204226.60893: stdout chunk (state=3): >>>import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff353ad2cf0><<< 44109 1727204226.60898: stdout chunk (state=3): >>> <<< 44109 1727204226.60932: stdout chunk (state=3): >>># 
/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py<<< 44109 1727204226.60996: stdout chunk (state=3): >>> <<< 44109 1727204226.61041: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc'<<< 44109 1727204226.61045: stdout chunk (state=3): >>> <<< 44109 1727204226.61077: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py<<< 44109 1727204226.61082: stdout chunk (state=3): >>> <<< 44109 1727204226.61148: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc'<<< 44109 1727204226.61154: stdout chunk (state=3): >>> <<< 44109 1727204226.61272: stdout chunk (state=3): >>>import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff3538cf380> # /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py <<< 44109 1727204226.61305: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc'<<< 44109 1727204226.61320: stdout chunk (state=3): >>> <<< 44109 1727204226.61355: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py<<< 44109 1727204226.61360: stdout chunk (state=3): >>> <<< 44109 1727204226.61402: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc'<<< 44109 1727204226.61407: stdout chunk (state=3): >>> <<< 44109 1727204226.61465: stdout chunk (state=3): >>>import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff3538f36e0><<< 44109 1727204226.61508: stdout chunk (state=3): >>> # /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py <<< 44109 1727204226.61591: 
stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc'<<< 44109 1727204226.61687: stdout chunk (state=3): >>> import 'ntpath' # <<< 44109 1727204226.61693: stdout chunk (state=3): >>> <<< 44109 1727204226.61734: stdout chunk (state=3): >>># /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py<<< 44109 1727204226.61745: stdout chunk (state=3): >>> # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc'<<< 44109 1727204226.61761: stdout chunk (state=3): >>> import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff353954500><<< 44109 1727204226.61771: stdout chunk (state=3): >>> <<< 44109 1727204226.61805: stdout chunk (state=3): >>># /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py<<< 44109 1727204226.61810: stdout chunk (state=3): >>> <<< 44109 1727204226.61896: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py<<< 44109 1727204226.61902: stdout chunk (state=3): >>> <<< 44109 1727204226.61969: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc'<<< 44109 1727204226.61974: stdout chunk (state=3): >>> <<< 44109 1727204226.62209: stdout chunk (state=3): >>>import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff353956c60> <<< 44109 1727204226.62264: stdout chunk (state=3): >>>import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff353954620> <<< 44109 1727204226.62352: stdout chunk (state=3): >>>import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff35391d4f0><<< 44109 1727204226.62372: stdout chunk (state=3): >>> <<< 44109 1727204226.62413: stdout chunk 
(state=3): >>># /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py <<< 44109 1727204226.62433: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff3537594f0><<< 44109 1727204226.62490: stdout chunk (state=3): >>> import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff3538f24e0> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff3538a7f50><<< 44109 1727204226.62826: stdout chunk (state=3): >>> # code object from '/usr/lib64/python3.12/encodings/cp437.pyc' import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7ff3538f2840> <<< 44109 1727204226.62946: stdout chunk (state=3): >>># zipimport: found 30 names in '/tmp/ansible_stat_payload_mhuyt_qb/ansible_stat_payload.zip' # zipimport: zlib available<<< 44109 1727204226.62967: stdout chunk (state=3): >>> <<< 44109 1727204226.63182: stdout chunk (state=3): >>># zipimport: zlib available<<< 44109 1727204226.63209: stdout chunk (state=3): >>> <<< 44109 1727204226.63235: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py<<< 44109 1727204226.63274: stdout chunk (state=3): >>> # code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc'<<< 44109 1727204226.63296: stdout chunk (state=3): >>> <<< 44109 1727204226.63367: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py<<< 44109 1727204226.63385: stdout chunk (state=3): >>> <<< 44109 1727204226.63535: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' <<< 44109 1727204226.63593: stdout chunk (state=3): >>># 
/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py <<< 44109 1727204226.63597: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc'<<< 44109 1727204226.63616: stdout chunk (state=3): >>> import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff3537ab260> import '_typing' # <<< 44109 1727204226.63905: stdout chunk (state=3): >>>import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff35378e150> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff35378d2e0><<< 44109 1727204226.63933: stdout chunk (state=3): >>> # zipimport: zlib available<<< 44109 1727204226.63973: stdout chunk (state=3): >>> import 'ansible' # <<< 44109 1727204226.63980: stdout chunk (state=3): >>> <<< 44109 1727204226.64001: stdout chunk (state=3): >>># zipimport: zlib available<<< 44109 1727204226.64040: stdout chunk (state=3): >>> # zipimport: zlib available <<< 44109 1727204226.64065: stdout chunk (state=3): >>># zipimport: zlib available<<< 44109 1727204226.64088: stdout chunk (state=3): >>> import 'ansible.module_utils' # <<< 44109 1727204226.64129: stdout chunk (state=3): >>> # zipimport: zlib available <<< 44109 1727204226.66686: stdout chunk (state=3): >>># zipimport: zlib available<<< 44109 1727204226.66707: stdout chunk (state=3): >>> <<< 44109 1727204226.68578: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff3537a9100> <<< 44109 1727204226.68628: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py <<< 44109 1727204226.68653: stdout chunk (state=3): >>># 
code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' <<< 44109 1727204226.68794: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc'<<< 44109 1727204226.68798: stdout chunk (state=3): >>> <<< 44109 1727204226.68896: stdout chunk (state=3): >>># extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff3537d6bd0><<< 44109 1727204226.68961: stdout chunk (state=3): >>> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff3537d6960> <<< 44109 1727204226.69027: stdout chunk (state=3): >>>import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff3537d6270> <<< 44109 1727204226.69342: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff3537d6cc0> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff3537abc80> import 'atexit' # # extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 
0x7ff3537d7980> # extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff3537d7bc0> <<< 44109 1727204226.69433: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py <<< 44109 1727204226.69479: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' <<< 44109 1727204226.69490: stdout chunk (state=3): >>>import '_locale' # <<< 44109 1727204226.69581: stdout chunk (state=3): >>>import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff3537fc0b0><<< 44109 1727204226.69669: stdout chunk (state=3): >>> import 'pwd' # # /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py # code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' <<< 44109 1727204226.69700: stdout chunk (state=3): >>>import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff353111e80><<< 44109 1727204226.69730: stdout chunk (state=3): >>> <<< 44109 1727204226.69749: stdout chunk (state=3): >>># extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so'<<< 44109 1727204226.69755: stdout chunk (state=3): >>> # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' <<< 44109 1727204226.69894: stdout chunk (state=3): >>>import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff353113aa0> # /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py <<< 44109 1727204226.69899: stdout chunk (state=3): >>># code object from 
'/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff353118470> <<< 44109 1727204226.69945: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py<<< 44109 1727204226.69948: stdout chunk (state=3): >>> <<< 44109 1727204226.69995: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc'<<< 44109 1727204226.70017: stdout chunk (state=3): >>> <<< 44109 1727204226.70124: stdout chunk (state=3): >>>import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff353119370> # /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py <<< 44109 1727204226.70160: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py<<< 44109 1727204226.70239: stdout chunk (state=3): >>> <<< 44109 1727204226.70265: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff3531200b0><<< 44109 1727204226.70290: stdout chunk (state=3): >>> <<< 44109 1727204226.70347: stdout chunk (state=3): >>># extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' <<< 44109 1727204226.70374: stdout chunk (state=3): >>># extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff353120410> <<< 44109 1727204226.70459: stdout chunk (state=3): >>>import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff35311a360> # 
/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py <<< 44109 1727204226.70555: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py <<< 44109 1727204226.70603: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py <<< 44109 1727204226.70713: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py<<< 44109 1727204226.70732: stdout chunk (state=3): >>> # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff353123f50><<< 44109 1727204226.70765: stdout chunk (state=3): >>> import '_tokenize' # <<< 44109 1727204226.70919: stdout chunk (state=3): >>> import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff353122a20> <<< 44109 1727204226.70935: stdout chunk (state=3): >>>import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff353122780> # /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py <<< 44109 1727204226.70959: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' <<< 44109 1727204226.71208: stdout chunk (state=3): >>>import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff353122cf0> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff35311a870> # extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so'<<< 
44109 1727204226.71249: stdout chunk (state=3): >>> # extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff353167fb0> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py<<< 44109 1727204226.71278: stdout chunk (state=3): >>> # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff3531682f0><<< 44109 1727204226.71328: stdout chunk (state=3): >>> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py <<< 44109 1727204226.71417: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc'<<< 44109 1727204226.71447: stdout chunk (state=3): >>> # /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py <<< 44109 1727204226.71495: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' # extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' <<< 44109 1727204226.71534: stdout chunk (state=3): >>># extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff353169dc0> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff353169b80><<< 44109 1727204226.71636: stdout chunk (state=3): >>> # /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py <<< 44109 1727204226.71825: stdout chunk (state=3): >>># 
code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc'<<< 44109 1727204226.71907: stdout chunk (state=3): >>> # extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' <<< 44109 1727204226.71940: stdout chunk (state=3): >>># extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff35316c320> <<< 44109 1727204226.71969: stdout chunk (state=3): >>>import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff35316a4b0> <<< 44109 1727204226.72074: stdout chunk (state=3): >>># /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py # code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' <<< 44109 1727204226.72102: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py<<< 44109 1727204226.72131: stdout chunk (state=3): >>> # code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc'<<< 44109 1727204226.72230: stdout chunk (state=3): >>> import '_string' # <<< 44109 1727204226.72253: stdout chunk (state=3): >>>import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff35316fa40> <<< 44109 1727204226.72533: stdout chunk (state=3): >>>import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff35316c410> # extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so'<<< 44109 1727204226.72562: stdout chunk (state=3): >>> # extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' <<< 44109 1727204226.72623: stdout chunk (state=3): >>>import 'systemd._journal' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7ff353170ad0> <<< 44109 1727204226.72653: stdout chunk (state=3): >>># extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' <<< 44109 1727204226.72749: stdout chunk (state=3): >>>import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff353170c50> <<< 44109 1727204226.72776: stdout chunk (state=3): >>># extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff353170c20><<< 44109 1727204226.72800: stdout chunk (state=3): >>> <<< 44109 1727204226.72865: stdout chunk (state=3): >>>import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff3531684a0> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc'<<< 44109 1727204226.72916: stdout chunk (state=3): >>> # /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py<<< 44109 1727204226.72931: stdout chunk (state=3): >>> <<< 44109 1727204226.73048: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' # extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' <<< 44109 1727204226.73068: stdout chunk (state=3): >>># extension module '_socket' executed from 
'/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' <<< 44109 1727204226.73199: stdout chunk (state=3): >>>import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff3531fc3b0> <<< 44109 1727204226.73327: stdout chunk (state=3): >>># extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so'<<< 44109 1727204226.73498: stdout chunk (state=3): >>> # extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' <<< 44109 1727204226.73516: stdout chunk (state=3): >>>import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff3531fd760> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff353172b40> # extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff353173a10> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff353172780> <<< 44109 1727204226.73539: stdout chunk (state=3): >>># zipimport: zlib available <<< 44109 1727204226.73579: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.compat' # <<< 44109 1727204226.73620: stdout chunk (state=3): >>> # zipimport: zlib available <<< 44109 1727204226.73915: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available<<< 44109 1727204226.73954: stdout chunk (state=3): >>> # zipimport: zlib available <<< 44109 1727204226.74038: stdout chunk (state=3): >>>import 'ansible.module_utils.common' # <<< 44109 1727204226.74078: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text' # 
<<< 44109 1727204226.74091: stdout chunk (state=3): >>># zipimport: zlib available <<< 44109 1727204226.74285: stdout chunk (state=3): >>># zipimport: zlib available <<< 44109 1727204226.74522: stdout chunk (state=3): >>># zipimport: zlib available <<< 44109 1727204226.75454: stdout chunk (state=3): >>># zipimport: zlib available<<< 44109 1727204226.75604: stdout chunk (state=3): >>> <<< 44109 1727204226.76479: stdout chunk (state=3): >>>import 'ansible.module_utils.six' # <<< 44109 1727204226.76521: stdout chunk (state=3): >>>import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # <<< 44109 1727204226.76542: stdout chunk (state=3): >>> import 'ansible.module_utils.common.text.converters' # <<< 44109 1727204226.76608: stdout chunk (state=3): >>># /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py <<< 44109 1727204226.76631: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' <<< 44109 1727204226.76737: stdout chunk (state=3): >>># extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' <<< 44109 1727204226.76752: stdout chunk (state=3): >>>import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff353001a00><<< 44109 1727204226.76980: stdout chunk (state=3): >>> # /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py <<< 44109 1727204226.77000: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff353002780> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff3531fd880> <<< 
44109 1727204226.77066: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.selinux' # <<< 44109 1727204226.77102: stdout chunk (state=3): >>> <<< 44109 1727204226.77106: stdout chunk (state=3): >>># zipimport: zlib available <<< 44109 1727204226.77299: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils._text' # # zipimport: zlib available <<< 44109 1727204226.77699: stdout chunk (state=3): >>># zipimport: zlib available <<< 44109 1727204226.77750: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py <<< 44109 1727204226.77768: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' <<< 44109 1727204226.77800: stdout chunk (state=3): >>>import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff3530027e0> <<< 44109 1727204226.77834: stdout chunk (state=3): >>># zipimport: zlib available <<< 44109 1727204226.78641: stdout chunk (state=3): >>># zipimport: zlib available<<< 44109 1727204226.78657: stdout chunk (state=3): >>> <<< 44109 1727204226.79457: stdout chunk (state=3): >>># zipimport: zlib available <<< 44109 1727204226.79713: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.common.collections' # <<< 44109 1727204226.79789: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 44109 1727204226.79843: stdout chunk (state=3): >>>import 'ansible.module_utils.common.warnings' # <<< 44109 1727204226.79867: stdout chunk (state=3): >>> <<< 44109 1727204226.79885: stdout chunk (state=3): >>># zipimport: zlib available<<< 44109 1727204226.80020: stdout chunk (state=3): >>> # zipimport: zlib available <<< 44109 1727204226.80221: stdout chunk (state=3): >>>import 'ansible.module_utils.errors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing' # <<< 44109 1727204226.80238: stdout chunk 
(state=3): >>> <<< 44109 1727204226.80546: stdout chunk (state=3): >>># zipimport: zlib available<<< 44109 1727204226.80549: stdout chunk (state=3): >>> <<< 44109 1727204226.80551: stdout chunk (state=3): >>># zipimport: zlib available <<< 44109 1727204226.80553: stdout chunk (state=3): >>>import 'ansible.module_utils.parsing.convert_bool' # # zipimport: zlib available <<< 44109 1727204226.80995: stdout chunk (state=3): >>># zipimport: zlib available <<< 44109 1727204226.81217: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py<<< 44109 1727204226.81299: stdout chunk (state=3): >>> <<< 44109 1727204226.81336: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc'<<< 44109 1727204226.81347: stdout chunk (state=3): >>> <<< 44109 1727204226.81373: stdout chunk (state=3): >>>import '_ast' # <<< 44109 1727204226.81419: stdout chunk (state=3): >>> <<< 44109 1727204226.81527: stdout chunk (state=3): >>>import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff353003920> <<< 44109 1727204226.81545: stdout chunk (state=3): >>># zipimport: zlib available <<< 44109 1727204226.81699: stdout chunk (state=3): >>># zipimport: zlib available <<< 44109 1727204226.81799: stdout chunk (state=3): >>>import 'ansible.module_utils.common.text.formatters' # <<< 44109 1727204226.81823: stdout chunk (state=3): >>>import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # <<< 44109 1727204226.81844: stdout chunk (state=3): >>> import 'ansible.module_utils.common.arg_spec' # <<< 44109 1727204226.81895: stdout chunk (state=3): >>> # zipimport: zlib available <<< 44109 1727204226.81953: stdout chunk (state=3): >>># zipimport: zlib available<<< 44109 1727204226.82023: stdout chunk (state=3): >>> import 'ansible.module_utils.common.locale' # <<< 44109 1727204226.82047: stdout chunk (state=3): >>># zipimport: zlib available 
<<< 44109 1727204226.82142: stdout chunk (state=3): >>># zipimport: zlib available <<< 44109 1727204226.82308: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 44109 1727204226.82458: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py <<< 44109 1727204226.82781: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' <<< 44109 1727204226.82824: stdout chunk (state=3): >>># extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff35300e1b0> import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff35300be30> import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # <<< 44109 1727204226.82847: stdout chunk (state=3): >>># zipimport: zlib available <<< 44109 1727204226.82945: stdout chunk (state=3): >>># zipimport: zlib available <<< 44109 1727204226.83084: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 44109 1727204226.83163: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py<<< 44109 1727204226.83188: stdout chunk (state=3): >>> # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc'<<< 44109 1727204226.83215: stdout chunk (state=3): >>> # /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py <<< 44109 
1727204226.83259: stdout chunk (state=3): >>># code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc'<<< 44109 1727204226.83280: stdout chunk (state=3): >>> <<< 44109 1727204226.83319: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py<<< 44109 1727204226.83428: stdout chunk (state=3): >>> # code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc'<<< 44109 1727204226.83492: stdout chunk (state=3): >>> # /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py <<< 44109 1727204226.83557: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' <<< 44109 1727204226.83706: stdout chunk (state=3): >>>import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff35382a990> import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff35381e690> <<< 44109 1727204226.83823: stdout chunk (state=3): >>>import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff35300dfd0><<< 44109 1727204226.83848: stdout chunk (state=3): >>> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff353170e90> # destroy ansible.module_utils.distro<<< 44109 1727204226.83869: stdout chunk (state=3): >>> import 'ansible.module_utils.distro' # # zipimport: zlib available <<< 44109 1727204226.84139: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.common._utils' # <<< 44109 1727204226.84142: stdout chunk (state=3): >>>import 'ansible.module_utils.common.sys_info' # <<< 44109 1727204226.84145: stdout chunk (state=3): >>> import 'ansible.module_utils.basic' # # zipimport: zlib available # zipimport: zlib available<<< 44109 1727204226.84164: stdout chunk (state=3): >>> import 'ansible.modules' # <<< 44109 1727204226.84187: stdout chunk (state=3): >>># 
zipimport: zlib available<<< 44109 1727204226.84214: stdout chunk (state=3): >>> <<< 44109 1727204226.84455: stdout chunk (state=3): >>># zipimport: zlib available<<< 44109 1727204226.84651: stdout chunk (state=3): >>> <<< 44109 1727204226.84834: stdout chunk (state=3): >>># zipimport: zlib available <<< 44109 1727204226.85030: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"path": "/run/ostree-booted", "follow": false, "get_checksum": true, "get_mime": true, "get_attributes": true, "checksum_algorithm": "sha1"}}} <<< 44109 1727204226.85067: stdout chunk (state=3): >>># destroy __main__ <<< 44109 1727204226.85626: stdout chunk (state=3): >>># clear sys.path_importer_cache <<< 44109 1727204226.85669: stdout chunk (state=3): >>># clear sys.path_hooks # clear builtins._ <<< 44109 1727204226.85740: stdout chunk (state=3): >>># clear sys.path # clear sys.argv<<< 44109 1727204226.85743: stdout chunk (state=3): >>> # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__<<< 44109 1727204226.85882: stdout chunk (state=3): >>> # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # 
cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator<<< 44109 1727204226.85908: stdout chunk (state=3): >>> # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # destroy base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2<<< 44109 1727204226.85968: stdout chunk 
(state=3): >>> # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json<<< 44109 1727204226.86029: stdout chunk (state=3): >>> # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime<<< 44109 1727204226.86034: stdout chunk (state=3): >>> # cleanup[2] removing _uuid # cleanup[2] 
removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # destroy socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat <<< 44109 1727204226.86081: stdout chunk (state=3): >>># destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text<<< 44109 1727204226.86090: stdout chunk (state=3): >>> # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # 
destroy _ast # cleanup[2] removing ast<<< 44109 1727204226.86119: stdout chunk (state=3): >>> # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4<<< 44109 1727204226.86154: stdout chunk (state=3): >>> # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils<<< 44109 1727204226.86200: stdout chunk (state=3): >>> # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules <<< 44109 1727204226.86737: stdout chunk (state=3): >>># destroy _sitebuiltins <<< 44109 1727204226.86764: stdout chunk (state=3): >>># destroy importlib.machinery<<< 44109 1727204226.86814: stdout chunk (state=3): >>> # destroy importlib._abc # destroy importlib.util # destroy _bz2<<< 44109 1727204226.86818: stdout chunk (state=3): >>> # 
destroy _compression<<< 44109 1727204226.86849: stdout chunk (state=3): >>> # destroy _lzma # destroy _blake2 # destroy binascii<<< 44109 1727204226.86874: stdout chunk (state=3): >>> # destroy struct # destroy zlib # destroy bz2<<< 44109 1727204226.86897: stdout chunk (state=3): >>> # destroy lzma # destroy zipfile._path # destroy zipfile <<< 44109 1727204226.86926: stdout chunk (state=3): >>># destroy pathlib # destroy zipfile._path.glob # destroy fnmatch # destroy ipaddress<<< 44109 1727204226.86966: stdout chunk (state=3): >>> # destroy ntpath<<< 44109 1727204226.87001: stdout chunk (state=3): >>> # destroy importlib <<< 44109 1727204226.87037: stdout chunk (state=3): >>># destroy zipimport # destroy __main__ # destroy tempfile # destroy systemd.journal # destroy systemd.daemon # destroy ansible.module_utils.compat.selinux # destroy hashlib # destroy json.decoder<<< 44109 1727204226.87062: stdout chunk (state=3): >>> # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings <<< 44109 1727204226.87099: stdout chunk (state=3): >>># destroy _locale # destroy pwd # destroy locale<<< 44109 1727204226.87134: stdout chunk (state=3): >>> # destroy signal # destroy fcntl # destroy select # destroy _signal<<< 44109 1727204226.87166: stdout chunk (state=3): >>> # destroy _posixsubprocess # destroy syslog<<< 44109 1727204226.87185: stdout chunk (state=3): >>> # destroy uuid # destroy selectors # destroy errno<<< 44109 1727204226.87207: stdout chunk (state=3): >>> # destroy array <<< 44109 1727204226.87254: stdout chunk (state=3): >>># destroy datetime # destroy selinux # destroy shutil<<< 44109 1727204226.87304: stdout chunk (state=3): >>> # destroy distro # destroy distro.distro # destroy argparse # destroy json # destroy logging # destroy shlex # destroy subprocess <<< 44109 1727204226.87385: stdout chunk (state=3): >>># cleanup[3] wiping selinux._selinux <<< 44109 1727204226.87406: stdout chunk (state=3): >>># cleanup[3] wiping 
ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 <<< 44109 1727204226.87430: stdout chunk (state=3): >>># cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache<<< 44109 1727204226.87473: stdout chunk (state=3): >>> # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize<<< 44109 1727204226.87505: stdout chunk (state=3): >>> # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib<<< 44109 1727204226.87527: stdout chunk (state=3): >>> # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect<<< 44109 1727204226.87578: stdout chunk (state=3): >>> # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum<<< 44109 1727204226.87630: stdout chunk (state=3): >>> # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools<<< 44109 1727204226.87740: stdout chunk (state=3): >>> # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath 
# cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs<<< 44109 1727204226.87807: stdout chunk (state=3): >>> # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external<<< 44109 1727204226.87882: stdout chunk (state=3): >>> # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime <<< 44109 1727204226.88059: stdout chunk (state=3): >>># destroy sys.monitoring # destroy _socket<<< 44109 1727204226.88086: stdout chunk (state=3): >>> <<< 44109 1727204226.88162: stdout chunk (state=3): >>># destroy _collections # destroy platform # destroy _uuid <<< 44109 1727204226.88189: stdout chunk (state=3): >>># destroy stat # destroy genericpath # destroy re._parser # destroy tokenize<<< 44109 1727204226.88337: stdout chunk (state=3): >>> # destroy ansible.module_utils.six.moves.urllib # destroy copyreg<<< 44109 1727204226.88341: stdout chunk (state=3): >>> # destroy contextlib # destroy _typing <<< 44109 1727204226.88374: stdout chunk (state=3): >>># destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy 
_imp # destroy _io # destroy marshal<<< 44109 1727204226.88483: stdout chunk (state=3): >>> # clear sys.meta_path # clear sys.modules<<< 44109 1727204226.88630: stdout chunk (state=3): >>> # destroy _frozen_importlib # destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy _codecs<<< 44109 1727204226.88656: stdout chunk (state=3): >>> # destroy io # destroy traceback<<< 44109 1727204226.88737: stdout chunk (state=3): >>> # destroy warnings # destroy weakref<<< 44109 1727204226.88758: stdout chunk (state=3): >>> # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect<<< 44109 1727204226.88847: stdout chunk (state=3): >>> # destroy time<<< 44109 1727204226.88863: stdout chunk (state=3): >>> # destroy _random # destroy _weakref # destroy _hashlib <<< 44109 1727204226.88884: stdout chunk (state=3): >>># destroy _operator<<< 44109 1727204226.88930: stdout chunk (state=3): >>> # destroy _string # destroy re # destroy itertools <<< 44109 1727204226.89036: stdout chunk (state=3): >>># destroy _abc # destroy _sre # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks <<< 44109 1727204226.89520: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44109 1727204226.89574: stderr chunk (state=3): >>>Shared connection to 10.31.14.47 closed. 
<<< 44109 1727204226.89587: stdout chunk (state=3): >>><<< 44109 1727204226.89606: stderr chunk (state=3): >>><<< 44109 1727204226.89721: _low_level_execute_command() done: rc=0, stdout=import _frozen_importlib # frozen import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # import '_io' # import 'marshal' # import 'posix' # import '_frozen_importlib_external' # # installing zipimport hook import 'time' # import 'zipimport' # # installed zipimport hook # /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' import '_codecs' # import 'codecs' # # /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff353c184d0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff353be7b30> # /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff353c1aa50> import '_signal' # import '_abc' # import 'abc' # import 'io' # import '_stat' # import 'stat' # import '_collections_abc' # import 'genericpath' # import 'posixpath' # import 'os' # import '_sitebuiltins' # Processing user site-packages Processing global site-packages Adding directory: '/usr/local/lib/python3.12/site-packages' Adding directory: '/usr/lib64/python3.12/site-packages' Adding directory: '/usr/lib/python3.12/site-packages' Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' # 
/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff353a09130> # /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py # code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff353a0a060> import 'site' # Python 3.12.5 (main, Aug 23 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. # /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py # code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py # code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py # code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py # code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff353a47f20> # /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py # code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff353a5c0b0> # 
/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py # code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py # code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' import 'itertools' # # /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff353a7f950> # /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py # code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff353a7ffe0> import '_collections' # import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff353a5fbf0> import '_functools' # import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff353a5d310> import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff353a450d0> # /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py # code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' import '_sre' # # /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py # code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py # code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff353aa3890> import 're._parser' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7ff353aa24b0> # /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff353a5e1e0> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff353aa0ce0> # /usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py # code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff353ad08f0> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff353a44350> # /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff353ad0da0> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff353ad0c50> # extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff353ad1010> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff353a42e70> # /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from 
'/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py # code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff353ad16a0> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff353ad13a0> import 'importlib.machinery' # # /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff353ad25a0> import 'importlib.util' # import 'runpy' # # /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py # code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff353aec7d0> import 'errno' # # extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff353aedf10> # /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py # code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' import '_compression' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7ff353aeedb0> # extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff353aef410> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff353aee300> # /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py # code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' # extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff353aefe90> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff353aef5c0> import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff353ad2600> # /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py # code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py # code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' # extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff35387bda0> # /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' # extension 
module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff3538a48c0> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff3538a4620> # extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff3538a48f0> # /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py # code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' # extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' # extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff3538a5220> # extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff3538a5c10> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff3538a4ad0> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff353879f40> # /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py # code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' # 
/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py # code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff3538a7020> import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff3538a5d60> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff353ad2cf0> # /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py # code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py # code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff3538cf380> # /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py # code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff3538f36e0> # /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py # code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' import 'ntpath' # # /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff353954500> # 
/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py # code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff353956c60> import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff353954620> import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff35391d4f0> # /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff3537594f0> import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff3538f24e0> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff3538a7f50> # code object from '/usr/lib64/python3.12/encodings/cp437.pyc' import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7ff3538f2840> # zipimport: found 30 names in '/tmp/ansible_stat_payload_mhuyt_qb/ansible_stat_payload.zip' # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py # code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py # code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from 
'/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff3537ab260> import '_typing' # import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff35378e150> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff35378d2e0> # zipimport: zlib available import 'ansible' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff3537a9100> # /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' # extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff3537d6bd0> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff3537d6960> import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff3537d6270> # 
/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff3537d6cc0> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff3537abc80> import 'atexit' # # extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff3537d7980> # extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff3537d7bc0> # /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py # code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' import '_locale' # import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff3537fc0b0> import 'pwd' # # /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py # code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff353111e80> # extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff353113aa0> # /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches 
/usr/lib64/python3.12/selectors.py # code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff353118470> # /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py # code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff353119370> # /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py # code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff3531200b0> # extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff353120410> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff35311a360> # /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py # code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py # code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' # 
/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff353123f50> import '_tokenize' # import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff353122a20> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff353122780> # /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py # code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff353122cf0> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff35311a870> # extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff353167fb0> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff3531682f0> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' # extension module '_datetime' loaded from 
'/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff353169dc0> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff353169b80> # /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py # code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' # extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff35316c320> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff35316a4b0> # /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py # code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py # code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' import '_string' # import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff35316fa40> import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff35316c410> # extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff353170ad0> # extension module 'systemd._reader' loaded from 
'/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff353170c50> # extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff353170c20> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff3531684a0> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py # code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' # extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff3531fc3b0> # extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff3531fd760> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff353172b40> # extension module 'systemd._daemon' loaded from 
'/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff353173a10> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff353172780> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.six' # import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # # /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' # extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff353001a00> # /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff353002780> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff3531fd880> import 
'ansible.module_utils.compat.selinux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils._text' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py # code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff3530027e0> # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.collections' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.warnings' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.errors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing.convert_bool' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py # code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' import '_ast' # import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff353003920> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text.formatters' # import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # import 'ansible.module_utils.common.arg_spec' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.locale' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py # code object from 
'/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' # extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff35300e1b0> import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff35300be30> import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' # /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py # code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py # code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff35382a990> import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff35381e690> import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff35300dfd0> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff353170e90> # destroy ansible.module_utils.distro 
import 'ansible.module_utils.distro' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # import 'ansible.module_utils.basic' # # zipimport: zlib available # zipimport: zlib available import 'ansible.modules' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"path": "/run/ostree-booted", "follow": false, "get_checksum": true, "get_mime": true, "get_attributes": true, "checksum_algorithm": "sha1"}}} # destroy __main__ # clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # 
cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # destroy base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing 
urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # destroy socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy 
ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] 
removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy _blake2 # destroy binascii # destroy struct # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy fnmatch # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy tempfile # destroy systemd.journal # destroy systemd.daemon # destroy ansible.module_utils.compat.selinux # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy pwd # destroy locale # destroy signal # destroy fcntl # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy selectors # destroy errno # destroy array # destroy datetime # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy json # destroy logging # destroy shlex # destroy 
subprocess # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # 
cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _hashlib # destroy _operator # destroy _string # destroy re # destroy itertools # destroy _abc # destroy _sre # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config 
debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.47 closed. 
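The interpreter shutdown chatter above (`# cleanup[2] …`, `# destroy …`) is printed after the module's JSON result, which is what triggers the "junk after the JSON data" warning that follows: Ansible recovers the leading JSON object and treats the rest as junk. A minimal sketch of that recovery idea (this is an illustration, not Ansible's actual parser):

```python
import json


def extract_json(raw: str):
    """Return (payload, junk): the first JSON object found in raw,
    plus everything printed after it (e.g. interpreter shutdown traces)."""
    decoder = json.JSONDecoder()
    start = raw.index("{")  # first candidate object in the mixed output
    obj, end = decoder.raw_decode(raw, start)  # end = index where parsing stopped
    return obj, raw[end:]
```

Running it over a fragment like the module output above separates the result from the verbose-shutdown noise.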
[WARNING]: Module invocation had junk after the JSON data: # destroy __main__ # clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] 
removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # destroy base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] 
removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # destroy socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # 
cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing 
ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy _blake2 # destroy binascii # destroy struct # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy fnmatch # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy tempfile # destroy systemd.journal # destroy systemd.daemon # destroy ansible.module_utils.compat.selinux # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy pwd # destroy locale # destroy signal # destroy fcntl # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy selectors # destroy errno # destroy array # destroy datetime # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy json # destroy logging # destroy shlex # destroy subprocess # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy 
textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy sys.monitoring # destroy _socket # destroy _collections # 
destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _hashlib # destroy _operator # destroy _string # destroy re # destroy itertools # destroy _abc # destroy _sre # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks 44109 1727204226.90448: done with _execute_module (stat, {'path': '/run/ostree-booted', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204226.234983-44392-189388378635265/', 
'_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 44109 1727204226.90451: _low_level_execute_command(): starting 44109 1727204226.90454: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204226.234983-44392-189388378635265/ > /dev/null 2>&1 && sleep 0' 44109 1727204226.90812: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 44109 1727204226.90903: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44109 1727204226.90907: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44109 1727204226.90910: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration <<< 44109 1727204226.90913: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 <<< 44109 1727204226.90915: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44109 1727204226.90917: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 44109 1727204226.90919: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44109 1727204226.90947: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44109 1727204226.91058: stderr chunk (state=3): >>>debug1: 
mux_client_request_session: master session id: 2 <<< 44109 1727204226.93884: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44109 1727204226.93907: stderr chunk (state=3): >>><<< 44109 1727204226.93916: stdout chunk (state=3): >>><<< 44109 1727204226.93939: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44109 1727204226.93950: handler run complete 44109 1727204226.93978: attempt loop complete, returning result 44109 1727204226.94084: _execute() done 44109 1727204226.94088: dumping result to json 44109 1727204226.94090: done dumping result, returning 44109 1727204226.94092: done running TaskExecutor() for managed-node1/TASK: Check if system is ostree [028d2410-947f-ed67-a560-0000000000c2] 44109 1727204226.94094: sending task result for task 028d2410-947f-ed67-a560-0000000000c2 44109 
1727204226.94166: done sending task result for task 028d2410-947f-ed67-a560-0000000000c2 44109 1727204226.94170: WORKER PROCESS EXITING ok: [managed-node1] => { "changed": false, "stat": { "exists": false } } 44109 1727204226.94247: no more pending results, returning what we have 44109 1727204226.94251: results queue empty 44109 1727204226.94252: checking for any_errors_fatal 44109 1727204226.94257: done checking for any_errors_fatal 44109 1727204226.94258: checking for max_fail_percentage 44109 1727204226.94259: done checking for max_fail_percentage 44109 1727204226.94260: checking to see if all hosts have failed and the running result is not ok 44109 1727204226.94260: done checking to see if all hosts have failed 44109 1727204226.94261: getting the remaining hosts for this loop 44109 1727204226.94262: done getting the remaining hosts for this loop 44109 1727204226.94266: getting the next task for host managed-node1 44109 1727204226.94272: done getting next task for host managed-node1 44109 1727204226.94274: ^ task is: TASK: Set flag to indicate system is ostree 44109 1727204226.94278: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44109 1727204226.94282: getting variables 44109 1727204226.94283: in VariableManager get_vars() 44109 1727204226.94394: Calling all_inventory to load vars for managed-node1 44109 1727204226.94396: Calling groups_inventory to load vars for managed-node1 44109 1727204226.94399: Calling all_plugins_inventory to load vars for managed-node1 44109 1727204226.94417: Calling all_plugins_play to load vars for managed-node1 44109 1727204226.94420: Calling groups_plugins_inventory to load vars for managed-node1 44109 1727204226.94423: Calling groups_plugins_play to load vars for managed-node1 44109 1727204226.94611: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44109 1727204226.94791: done with get_vars() 44109 1727204226.94801: done getting variables 44109 1727204226.94894: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [Set flag to indicate system is ostree] *********************************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:22 Tuesday 24 September 2024 14:57:06 -0400 (0:00:00.781) 0:00:03.745 ***** 44109 1727204226.94921: entering _queue_task() for managed-node1/set_fact 44109 1727204226.94923: Creating lock for set_fact 44109 1727204226.95388: worker is 1 (out of 1 available) 44109 1727204226.95395: exiting _queue_task() for managed-node1/set_fact 44109 1727204226.95403: done queuing things up, now waiting for results queue to drain 44109 1727204226.95404: waiting for pending results... 
44109 1727204226.95496: running TaskExecutor() for managed-node1/TASK: Set flag to indicate system is ostree 44109 1727204226.95560: in run() - task 028d2410-947f-ed67-a560-0000000000c3 44109 1727204226.95584: variable 'ansible_search_path' from source: unknown 44109 1727204226.95594: variable 'ansible_search_path' from source: unknown 44109 1727204226.95640: calling self._execute() 44109 1727204226.95737: variable 'ansible_host' from source: host vars for 'managed-node1' 44109 1727204226.95741: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44109 1727204226.95743: variable 'omit' from source: magic vars 44109 1727204226.96235: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 44109 1727204226.96456: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 44109 1727204226.96597: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 44109 1727204226.96600: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 44109 1727204226.96603: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 44109 1727204226.96662: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 44109 1727204226.96693: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 44109 1727204226.96728: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 44109 1727204226.96758: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 44109 1727204226.96884: Evaluated conditional (not __network_is_ostree is defined): True 44109 1727204226.96895: variable 'omit' from source: magic vars 44109 1727204226.96938: variable 'omit' from source: magic vars 44109 1727204226.97059: variable '__ostree_booted_stat' from source: set_fact 44109 1727204226.97117: variable 'omit' from source: magic vars 44109 1727204226.97150: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 44109 1727204226.97185: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 44109 1727204226.97208: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 44109 1727204226.97230: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44109 1727204226.97250: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44109 1727204226.97355: variable 'inventory_hostname' from source: host vars for 'managed-node1' 44109 1727204226.97358: variable 'ansible_host' from source: host vars for 'managed-node1' 44109 1727204226.97361: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44109 1727204226.97398: Set connection var ansible_connection to ssh 44109 1727204226.97411: Set connection var ansible_timeout to 10 44109 1727204226.97424: Set connection var ansible_module_compression to ZIP_DEFLATED 44109 1727204226.97438: Set connection var ansible_pipelining to False 44109 1727204226.97450: Set connection var ansible_shell_executable to /bin/sh 44109 1727204226.97466: Set connection var ansible_shell_type to sh 44109 1727204226.97498: variable 'ansible_shell_executable' 
from source: unknown 44109 1727204226.97509: variable 'ansible_connection' from source: unknown 44109 1727204226.97519: variable 'ansible_module_compression' from source: unknown 44109 1727204226.97527: variable 'ansible_shell_type' from source: unknown 44109 1727204226.97536: variable 'ansible_shell_executable' from source: unknown 44109 1727204226.97545: variable 'ansible_host' from source: host vars for 'managed-node1' 44109 1727204226.97554: variable 'ansible_pipelining' from source: unknown 44109 1727204226.97572: variable 'ansible_timeout' from source: unknown 44109 1727204226.97577: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44109 1727204226.97682: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 44109 1727204226.97790: variable 'omit' from source: magic vars 44109 1727204226.97793: starting attempt loop 44109 1727204226.97796: running the handler 44109 1727204226.97798: handler run complete 44109 1727204226.97800: attempt loop complete, returning result 44109 1727204226.97802: _execute() done 44109 1727204226.97804: dumping result to json 44109 1727204226.97806: done dumping result, returning 44109 1727204226.97808: done running TaskExecutor() for managed-node1/TASK: Set flag to indicate system is ostree [028d2410-947f-ed67-a560-0000000000c3] 44109 1727204226.97810: sending task result for task 028d2410-947f-ed67-a560-0000000000c3 44109 1727204226.97880: done sending task result for task 028d2410-947f-ed67-a560-0000000000c3 44109 1727204226.97883: WORKER PROCESS EXITING ok: [managed-node1] => { "ansible_facts": { "__network_is_ostree": false }, "changed": false } 44109 1727204226.97951: no more pending results, returning what we have 44109 1727204226.97955: results 
queue empty 44109 1727204226.97956: checking for any_errors_fatal 44109 1727204226.97962: done checking for any_errors_fatal 44109 1727204226.97963: checking for max_fail_percentage 44109 1727204226.97965: done checking for max_fail_percentage 44109 1727204226.97965: checking to see if all hosts have failed and the running result is not ok 44109 1727204226.97966: done checking to see if all hosts have failed 44109 1727204226.97967: getting the remaining hosts for this loop 44109 1727204226.97968: done getting the remaining hosts for this loop 44109 1727204226.97972: getting the next task for host managed-node1 44109 1727204226.97988: done getting next task for host managed-node1 44109 1727204226.97992: ^ task is: TASK: Fix CentOS6 Base repo 44109 1727204226.97995: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44109 1727204226.97999: getting variables 44109 1727204226.98001: in VariableManager get_vars() 44109 1727204226.98032: Calling all_inventory to load vars for managed-node1 44109 1727204226.98035: Calling groups_inventory to load vars for managed-node1 44109 1727204226.98039: Calling all_plugins_inventory to load vars for managed-node1 44109 1727204226.98050: Calling all_plugins_play to load vars for managed-node1 44109 1727204226.98054: Calling groups_plugins_inventory to load vars for managed-node1 44109 1727204226.98066: Calling groups_plugins_play to load vars for managed-node1 44109 1727204226.98521: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44109 1727204226.98712: done with get_vars() 44109 1727204226.98721: done getting variables 44109 1727204226.98831: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [Fix CentOS6 Base repo] *************************************************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:26 Tuesday 24 September 2024 14:57:06 -0400 (0:00:00.039) 0:00:03.784 ***** 44109 1727204226.98857: entering _queue_task() for managed-node1/copy 44109 1727204226.99096: worker is 1 (out of 1 available) 44109 1727204226.99108: exiting _queue_task() for managed-node1/copy 44109 1727204226.99118: done queuing things up, now waiting for results queue to drain 44109 1727204226.99119: waiting for pending results... 
44109 1727204226.99345: running TaskExecutor() for managed-node1/TASK: Fix CentOS6 Base repo 44109 1727204226.99445: in run() - task 028d2410-947f-ed67-a560-0000000000c5 44109 1727204226.99464: variable 'ansible_search_path' from source: unknown 44109 1727204226.99551: variable 'ansible_search_path' from source: unknown 44109 1727204226.99555: calling self._execute() 44109 1727204226.99582: variable 'ansible_host' from source: host vars for 'managed-node1' 44109 1727204226.99592: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44109 1727204226.99604: variable 'omit' from source: magic vars 44109 1727204227.00053: variable 'ansible_distribution' from source: facts 44109 1727204227.00080: Evaluated conditional (ansible_distribution == 'CentOS'): True 44109 1727204227.00184: variable 'ansible_distribution_major_version' from source: facts 44109 1727204227.00194: Evaluated conditional (ansible_distribution_major_version == '6'): False 44109 1727204227.00202: when evaluation is False, skipping this task 44109 1727204227.00211: _execute() done 44109 1727204227.00217: dumping result to json 44109 1727204227.00222: done dumping result, returning 44109 1727204227.00230: done running TaskExecutor() for managed-node1/TASK: Fix CentOS6 Base repo [028d2410-947f-ed67-a560-0000000000c5] 44109 1727204227.00237: sending task result for task 028d2410-947f-ed67-a560-0000000000c5 44109 1727204227.00377: done sending task result for task 028d2410-947f-ed67-a560-0000000000c5 skipping: [managed-node1] => { "changed": false, "false_condition": "ansible_distribution_major_version == '6'", "skip_reason": "Conditional result was False" } 44109 1727204227.00539: no more pending results, returning what we have 44109 1727204227.00542: results queue empty 44109 1727204227.00543: checking for any_errors_fatal 44109 1727204227.00546: done checking for any_errors_fatal 44109 1727204227.00547: checking for max_fail_percentage 44109 1727204227.00548: done checking for 
max_fail_percentage 44109 1727204227.00549: checking to see if all hosts have failed and the running result is not ok 44109 1727204227.00550: done checking to see if all hosts have failed 44109 1727204227.00550: getting the remaining hosts for this loop 44109 1727204227.00551: done getting the remaining hosts for this loop 44109 1727204227.00554: getting the next task for host managed-node1 44109 1727204227.00559: done getting next task for host managed-node1 44109 1727204227.00561: ^ task is: TASK: Include the task 'enable_epel.yml' 44109 1727204227.00563: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44109 1727204227.00566: getting variables 44109 1727204227.00567: in VariableManager get_vars() 44109 1727204227.00591: Calling all_inventory to load vars for managed-node1 44109 1727204227.00594: Calling groups_inventory to load vars for managed-node1 44109 1727204227.00596: Calling all_plugins_inventory to load vars for managed-node1 44109 1727204227.00604: Calling all_plugins_play to load vars for managed-node1 44109 1727204227.00607: Calling groups_plugins_inventory to load vars for managed-node1 44109 1727204227.00610: Calling groups_plugins_play to load vars for managed-node1 44109 1727204227.00763: WORKER PROCESS EXITING 44109 1727204227.00789: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44109 1727204227.01007: done with get_vars() 44109 1727204227.01017: done getting variables TASK [Include the task 'enable_epel.yml'] ************************************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:51 Tuesday 24 September 2024 14:57:07 -0400 (0:00:00.022) 0:00:03.807 ***** 44109 1727204227.01107: entering _queue_task() for managed-node1/include_tasks 44109 1727204227.01358: worker is 1 (out of 1 available) 44109 1727204227.01371: exiting _queue_task() for managed-node1/include_tasks 44109 1727204227.01383: done queuing things up, now waiting for results queue to drain 44109 1727204227.01384: waiting for pending results... 
44109 1727204227.01625: running TaskExecutor() for managed-node1/TASK: Include the task 'enable_epel.yml' 44109 1727204227.01735: in run() - task 028d2410-947f-ed67-a560-0000000000c6 44109 1727204227.01754: variable 'ansible_search_path' from source: unknown 44109 1727204227.01762: variable 'ansible_search_path' from source: unknown 44109 1727204227.01805: calling self._execute() 44109 1727204227.01887: variable 'ansible_host' from source: host vars for 'managed-node1' 44109 1727204227.01898: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44109 1727204227.01913: variable 'omit' from source: magic vars 44109 1727204227.02402: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 44109 1727204227.04542: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 44109 1727204227.04811: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 44109 1727204227.04857: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 44109 1727204227.04896: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 44109 1727204227.04925: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 44109 1727204227.05007: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44109 1727204227.05041: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44109 1727204227.05079: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44109 1727204227.05125: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44109 1727204227.05143: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44109 1727204227.05259: variable '__network_is_ostree' from source: set_fact 44109 1727204227.05381: Evaluated conditional (not __network_is_ostree | d(false)): True 44109 1727204227.05385: _execute() done 44109 1727204227.05387: dumping result to json 44109 1727204227.05390: done dumping result, returning 44109 1727204227.05392: done running TaskExecutor() for managed-node1/TASK: Include the task 'enable_epel.yml' [028d2410-947f-ed67-a560-0000000000c6] 44109 1727204227.05394: sending task result for task 028d2410-947f-ed67-a560-0000000000c6 44109 1727204227.05462: done sending task result for task 028d2410-947f-ed67-a560-0000000000c6 44109 1727204227.05465: WORKER PROCESS EXITING 44109 1727204227.05512: no more pending results, returning what we have 44109 1727204227.05518: in VariableManager get_vars() 44109 1727204227.05552: Calling all_inventory to load vars for managed-node1 44109 1727204227.05555: Calling groups_inventory to load vars for managed-node1 44109 1727204227.05559: Calling all_plugins_inventory to load vars for managed-node1 44109 1727204227.05569: Calling all_plugins_play to load vars for managed-node1 44109 1727204227.05573: Calling groups_plugins_inventory to load vars for managed-node1 44109 1727204227.05578: Calling groups_plugins_play to load vars for managed-node1 44109 1727204227.05949: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved 
name 44109 1727204227.06159: done with get_vars() 44109 1727204227.06167: variable 'ansible_search_path' from source: unknown 44109 1727204227.06168: variable 'ansible_search_path' from source: unknown 44109 1727204227.06206: we have included files to process 44109 1727204227.06207: generating all_blocks data 44109 1727204227.06209: done generating all_blocks data 44109 1727204227.06214: processing included file: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml 44109 1727204227.06216: loading included file: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml 44109 1727204227.06218: Loading data from /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml 44109 1727204227.06891: done processing included file 44109 1727204227.06893: iterating over new_blocks loaded from include file 44109 1727204227.06895: in VariableManager get_vars() 44109 1727204227.06906: done with get_vars() 44109 1727204227.06908: filtering new block on tags 44109 1727204227.06929: done filtering new block on tags 44109 1727204227.06932: in VariableManager get_vars() 44109 1727204227.06943: done with get_vars() 44109 1727204227.06944: filtering new block on tags 44109 1727204227.06955: done filtering new block on tags 44109 1727204227.06957: done iterating over new_blocks loaded from include file included: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml for managed-node1 44109 1727204227.06963: extending task lists for all hosts with included blocks 44109 1727204227.07066: done extending task lists 44109 1727204227.07067: done processing included files 44109 1727204227.07068: results queue empty 44109 1727204227.07069: checking for any_errors_fatal 44109 1727204227.07072: done checking for any_errors_fatal 44109 1727204227.07073: checking for max_fail_percentage 44109 1727204227.07074: done 
checking for max_fail_percentage 44109 1727204227.07076: checking to see if all hosts have failed and the running result is not ok 44109 1727204227.07077: done checking to see if all hosts have failed 44109 1727204227.07078: getting the remaining hosts for this loop 44109 1727204227.07079: done getting the remaining hosts for this loop 44109 1727204227.07081: getting the next task for host managed-node1 44109 1727204227.07086: done getting next task for host managed-node1 44109 1727204227.07088: ^ task is: TASK: Create EPEL {{ ansible_distribution_major_version }} 44109 1727204227.07090: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44109 1727204227.07092: getting variables 44109 1727204227.07093: in VariableManager get_vars() 44109 1727204227.07101: Calling all_inventory to load vars for managed-node1 44109 1727204227.07104: Calling groups_inventory to load vars for managed-node1 44109 1727204227.07107: Calling all_plugins_inventory to load vars for managed-node1 44109 1727204227.07112: Calling all_plugins_play to load vars for managed-node1 44109 1727204227.07119: Calling groups_plugins_inventory to load vars for managed-node1 44109 1727204227.07123: Calling groups_plugins_play to load vars for managed-node1 44109 1727204227.07272: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44109 1727204227.07453: done with get_vars() 44109 1727204227.07461: done getting variables 44109 1727204227.07529: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) 44109 1727204227.07727: variable 'ansible_distribution_major_version' from source: facts TASK [Create EPEL 10] ********************************************************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:8 Tuesday 24 September 2024 14:57:07 -0400 (0:00:00.066) 0:00:03.874 ***** 44109 1727204227.07771: entering _queue_task() for managed-node1/command 44109 1727204227.07773: Creating lock for command 44109 1727204227.08064: worker is 1 (out of 1 available) 44109 1727204227.08186: exiting _queue_task() for managed-node1/command 44109 1727204227.08196: done queuing things up, now waiting for results queue to drain 44109 1727204227.08197: waiting for pending results... 
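The task header above shows the templated name `Create EPEL {{ ansible_distribution_major_version }}` rendered to `Create EPEL 10` from the host's gathered facts. The substitution can be modeled with a minimal sketch; this is not Ansible's real templar (which compiles full Jinja2 expressions), only a toy that handles bare `{{ var }}` references:

```python
import re

def render(template, variables):
    """Minimal model of Jinja2-style '{{ var }}' substitution.

    Ansible's real templar supports full Jinja2; this sketch only handles
    bare variable references, which is enough to show how the task name in
    the log is produced from a fact like
    ansible_distribution_major_version.
    """
    def replace(match):
        name = match.group(1)
        return str(variables[name])
    return re.sub(r"\{\{\s*(\w+)\s*\}\}", replace, template)

facts = {"ansible_distribution_major_version": "10"}
print(render("Create EPEL {{ ansible_distribution_major_version }}", facts))
```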
44109 1727204227.08393: running TaskExecutor() for managed-node1/TASK: Create EPEL 10 44109 1727204227.08452: in run() - task 028d2410-947f-ed67-a560-0000000000e0 44109 1727204227.08472: variable 'ansible_search_path' from source: unknown 44109 1727204227.08482: variable 'ansible_search_path' from source: unknown 44109 1727204227.08521: calling self._execute() 44109 1727204227.08599: variable 'ansible_host' from source: host vars for 'managed-node1' 44109 1727204227.08610: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44109 1727204227.08624: variable 'omit' from source: magic vars 44109 1727204227.09071: variable 'ansible_distribution' from source: facts 44109 1727204227.09076: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True 44109 1727204227.09137: variable 'ansible_distribution_major_version' from source: facts 44109 1727204227.09147: Evaluated conditional (ansible_distribution_major_version in ['7', '8']): False 44109 1727204227.09154: when evaluation is False, skipping this task 44109 1727204227.09161: _execute() done 44109 1727204227.09168: dumping result to json 44109 1727204227.09174: done dumping result, returning 44109 1727204227.09192: done running TaskExecutor() for managed-node1/TASK: Create EPEL 10 [028d2410-947f-ed67-a560-0000000000e0] 44109 1727204227.09203: sending task result for task 028d2410-947f-ed67-a560-0000000000e0 skipping: [managed-node1] => { "changed": false, "false_condition": "ansible_distribution_major_version in ['7', '8']", "skip_reason": "Conditional result was False" } 44109 1727204227.09354: no more pending results, returning what we have 44109 1727204227.09358: results queue empty 44109 1727204227.09358: checking for any_errors_fatal 44109 1727204227.09360: done checking for any_errors_fatal 44109 1727204227.09360: checking for max_fail_percentage 44109 1727204227.09362: done checking for max_fail_percentage 44109 1727204227.09363: checking to see if all hosts have failed 
and the running result is not ok 44109 1727204227.09364: done checking to see if all hosts have failed 44109 1727204227.09364: getting the remaining hosts for this loop 44109 1727204227.09366: done getting the remaining hosts for this loop 44109 1727204227.09370: getting the next task for host managed-node1 44109 1727204227.09379: done getting next task for host managed-node1 44109 1727204227.09381: ^ task is: TASK: Install yum-utils package 44109 1727204227.09385: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44109 1727204227.09390: getting variables 44109 1727204227.09391: in VariableManager get_vars() 44109 1727204227.09420: Calling all_inventory to load vars for managed-node1 44109 1727204227.09423: Calling groups_inventory to load vars for managed-node1 44109 1727204227.09427: Calling all_plugins_inventory to load vars for managed-node1 44109 1727204227.09439: Calling all_plugins_play to load vars for managed-node1 44109 1727204227.09442: Calling groups_plugins_inventory to load vars for managed-node1 44109 1727204227.09444: Calling groups_plugins_play to load vars for managed-node1 44109 1727204227.09845: done sending task result for task 028d2410-947f-ed67-a560-0000000000e0 44109 1727204227.09848: WORKER PROCESS EXITING 44109 1727204227.09869: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44109 1727204227.10065: done with get_vars() 44109 1727204227.10074: done getting variables 44109 1727204227.10167: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [Install yum-utils package] *********************************************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:26 Tuesday 24 September 2024 14:57:07 -0400 (0:00:00.024) 0:00:03.898 ***** 44109 1727204227.10194: entering _queue_task() for managed-node1/package 44109 1727204227.10196: Creating lock for package 44109 1727204227.10438: worker is 1 (out of 1 available) 44109 1727204227.10450: exiting _queue_task() for managed-node1/package 44109 1727204227.10460: done queuing things up, now waiting for results queue to drain 44109 1727204227.10461: waiting for pending results... 
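Every skipped task in this run follows the same pattern visible above: the task's `when` conditions are evaluated in order against the host's facts, evaluation stops at the first false condition, and that condition string is echoed back as `false_condition` in the skip result. A rough Python model of that decision (the condition strings and result keys are copied from the log; the evaluation itself is simplified, since real Ansible compiles the expressions with Jinja2):

```python
def evaluate_when(conditions, facts):
    """Evaluate (expression, predicate) pairs the way the log shows:
    conditions are ANDed and evaluation stops at the first False one.

    Each condition carries its own predicate function so the sketch stays
    self-contained; Ansible instead templates the expression string.
    """
    for expression, predicate in conditions:
        if not predicate(facts):
            # Mirrors the skip result printed in the log.
            return {
                "changed": False,
                "false_condition": expression,
                "skip_reason": "Conditional result was False",
            }
    return None  # no condition failed; the task would run

facts = {
    "ansible_distribution": "CentOS",
    "ansible_distribution_major_version": "10",
}

conditions = [
    ("ansible_distribution in ['RedHat', 'CentOS']",
     lambda f: f["ansible_distribution"] in ["RedHat", "CentOS"]),
    ("ansible_distribution_major_version in ['7', '8']",
     lambda f: f["ansible_distribution_major_version"] in ["7", "8"]),
]

print(evaluate_when(conditions, facts))
```

With these facts the first condition evaluates True and the second False, reproducing the `skipping: [managed-node1]` result bodies in the log.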
44109 1727204227.10691: running TaskExecutor() for managed-node1/TASK: Install yum-utils package 44109 1727204227.11083: in run() - task 028d2410-947f-ed67-a560-0000000000e1 44109 1727204227.11087: variable 'ansible_search_path' from source: unknown 44109 1727204227.11089: variable 'ansible_search_path' from source: unknown 44109 1727204227.11091: calling self._execute() 44109 1727204227.11684: variable 'ansible_host' from source: host vars for 'managed-node1' 44109 1727204227.11687: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44109 1727204227.11690: variable 'omit' from source: magic vars 44109 1727204227.12418: variable 'ansible_distribution' from source: facts 44109 1727204227.12459: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True 44109 1727204227.12784: variable 'ansible_distribution_major_version' from source: facts 44109 1727204227.12796: Evaluated conditional (ansible_distribution_major_version in ['7', '8']): False 44109 1727204227.12805: when evaluation is False, skipping this task 44109 1727204227.12813: _execute() done 44109 1727204227.12820: dumping result to json 44109 1727204227.12827: done dumping result, returning 44109 1727204227.12838: done running TaskExecutor() for managed-node1/TASK: Install yum-utils package [028d2410-947f-ed67-a560-0000000000e1] 44109 1727204227.12847: sending task result for task 028d2410-947f-ed67-a560-0000000000e1 44109 1727204227.13081: done sending task result for task 028d2410-947f-ed67-a560-0000000000e1 44109 1727204227.13085: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "ansible_distribution_major_version in ['7', '8']", "skip_reason": "Conditional result was False" } 44109 1727204227.13670: no more pending results, returning what we have 44109 1727204227.13673: results queue empty 44109 1727204227.13673: checking for any_errors_fatal 44109 1727204227.13678: done checking for any_errors_fatal 44109 
1727204227.13679: checking for max_fail_percentage 44109 1727204227.13680: done checking for max_fail_percentage 44109 1727204227.13681: checking to see if all hosts have failed and the running result is not ok 44109 1727204227.13681: done checking to see if all hosts have failed 44109 1727204227.13682: getting the remaining hosts for this loop 44109 1727204227.13683: done getting the remaining hosts for this loop 44109 1727204227.13686: getting the next task for host managed-node1 44109 1727204227.13691: done getting next task for host managed-node1 44109 1727204227.13694: ^ task is: TASK: Enable EPEL 7 44109 1727204227.13698: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44109 1727204227.13701: getting variables 44109 1727204227.13702: in VariableManager get_vars() 44109 1727204227.13724: Calling all_inventory to load vars for managed-node1 44109 1727204227.13727: Calling groups_inventory to load vars for managed-node1 44109 1727204227.13730: Calling all_plugins_inventory to load vars for managed-node1 44109 1727204227.13739: Calling all_plugins_play to load vars for managed-node1 44109 1727204227.13742: Calling groups_plugins_inventory to load vars for managed-node1 44109 1727204227.13745: Calling groups_plugins_play to load vars for managed-node1 44109 1727204227.14038: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44109 1727204227.14232: done with get_vars() 44109 1727204227.14241: done getting variables 44109 1727204227.14300: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Enable EPEL 7] *********************************************************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:32 Tuesday 24 September 2024 14:57:07 -0400 (0:00:00.041) 0:00:03.939 ***** 44109 1727204227.14327: entering _queue_task() for managed-node1/command 44109 1727204227.14950: worker is 1 (out of 1 available) 44109 1727204227.14963: exiting _queue_task() for managed-node1/command 44109 1727204227.14977: done queuing things up, now waiting for results queue to drain 44109 1727204227.14978: waiting for pending results... 
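Note the `found_in_cache=False` on the first load of the 'command' action plugin earlier in the run, followed by `found_in_cache=True` on the lookup above: this is ordinary memoization in the plugin loader. A toy version of that behavior (not the actual `ansible.plugins.loader` code, just a sketch of the cache-hit pattern the log records):

```python
class PluginLoader:
    """Toy plugin loader that memoizes lookups, mimicking the
    found_in_cache=False / found_in_cache=True pairs in the log."""

    def __init__(self):
        self._cache = {}
        self.load_count = 0  # how many real (non-cached) loads happened

    def get(self, name):
        found_in_cache = name in self._cache
        if not found_in_cache:
            self.load_count += 1           # stand-in for importing the module file
            self._cache[name] = object()   # stand-in for the loaded plugin class
        return self._cache[name], found_in_cache

loader = PluginLoader()
_, first = loader.get("command")   # first lookup misses the cache
_, second = loader.get("command")  # repeat lookup hits the cache
print(first, second, loader.load_count)
```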
44109 1727204227.15423: running TaskExecutor() for managed-node1/TASK: Enable EPEL 7 44109 1727204227.15644: in run() - task 028d2410-947f-ed67-a560-0000000000e2 44109 1727204227.15659: variable 'ansible_search_path' from source: unknown 44109 1727204227.15662: variable 'ansible_search_path' from source: unknown 44109 1727204227.15802: calling self._execute() 44109 1727204227.15962: variable 'ansible_host' from source: host vars for 'managed-node1' 44109 1727204227.15969: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44109 1727204227.15980: variable 'omit' from source: magic vars 44109 1727204227.16957: variable 'ansible_distribution' from source: facts 44109 1727204227.16961: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True 44109 1727204227.16997: variable 'ansible_distribution_major_version' from source: facts 44109 1727204227.17073: Evaluated conditional (ansible_distribution_major_version in ['7', '8']): False 44109 1727204227.17084: when evaluation is False, skipping this task 44109 1727204227.17091: _execute() done 44109 1727204227.17097: dumping result to json 44109 1727204227.17105: done dumping result, returning 44109 1727204227.17116: done running TaskExecutor() for managed-node1/TASK: Enable EPEL 7 [028d2410-947f-ed67-a560-0000000000e2] 44109 1727204227.17125: sending task result for task 028d2410-947f-ed67-a560-0000000000e2 skipping: [managed-node1] => { "changed": false, "false_condition": "ansible_distribution_major_version in ['7', '8']", "skip_reason": "Conditional result was False" } 44109 1727204227.17425: no more pending results, returning what we have 44109 1727204227.17429: results queue empty 44109 1727204227.17429: checking for any_errors_fatal 44109 1727204227.17439: done checking for any_errors_fatal 44109 1727204227.17440: checking for max_fail_percentage 44109 1727204227.17441: done checking for max_fail_percentage 44109 1727204227.17442: checking to see if all hosts have failed and 
the running result is not ok 44109 1727204227.17443: done checking to see if all hosts have failed 44109 1727204227.17444: getting the remaining hosts for this loop 44109 1727204227.17445: done getting the remaining hosts for this loop 44109 1727204227.17449: getting the next task for host managed-node1 44109 1727204227.17456: done getting next task for host managed-node1 44109 1727204227.17459: ^ task is: TASK: Enable EPEL 8 44109 1727204227.17464: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44109 1727204227.17468: getting variables 44109 1727204227.17469: in VariableManager get_vars() 44109 1727204227.17502: Calling all_inventory to load vars for managed-node1 44109 1727204227.17507: Calling groups_inventory to load vars for managed-node1 44109 1727204227.17510: Calling all_plugins_inventory to load vars for managed-node1 44109 1727204227.17523: Calling all_plugins_play to load vars for managed-node1 44109 1727204227.17527: Calling groups_plugins_inventory to load vars for managed-node1 44109 1727204227.17529: Calling groups_plugins_play to load vars for managed-node1 44109 1727204227.18100: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44109 1727204227.18369: done with get_vars() 44109 1727204227.18381: done getting variables 44109 1727204227.18583: done sending task result for task 028d2410-947f-ed67-a560-0000000000e2 44109 1727204227.18586: WORKER PROCESS EXITING 44109 1727204227.18620: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Enable EPEL 8] *********************************************************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:37 Tuesday 24 September 2024 14:57:07 -0400 (0:00:00.043) 0:00:03.982 ***** 44109 1727204227.18648: entering _queue_task() for managed-node1/command 44109 1727204227.19121: worker is 1 (out of 1 available) 44109 1727204227.19131: exiting _queue_task() for managed-node1/command 44109 1727204227.19141: done queuing things up, now waiting for results queue to drain 44109 1727204227.19142: waiting for pending results... 
44109 1727204227.19539: running TaskExecutor() for managed-node1/TASK: Enable EPEL 8 44109 1727204227.19656: in run() - task 028d2410-947f-ed67-a560-0000000000e3 44109 1727204227.19674: variable 'ansible_search_path' from source: unknown 44109 1727204227.19685: variable 'ansible_search_path' from source: unknown 44109 1727204227.19725: calling self._execute() 44109 1727204227.19806: variable 'ansible_host' from source: host vars for 'managed-node1' 44109 1727204227.19819: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44109 1727204227.19832: variable 'omit' from source: magic vars 44109 1727204227.20198: variable 'ansible_distribution' from source: facts 44109 1727204227.20213: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True 44109 1727204227.20347: variable 'ansible_distribution_major_version' from source: facts 44109 1727204227.20363: Evaluated conditional (ansible_distribution_major_version in ['7', '8']): False 44109 1727204227.20371: when evaluation is False, skipping this task 44109 1727204227.20379: _execute() done 44109 1727204227.20385: dumping result to json 44109 1727204227.20392: done dumping result, returning 44109 1727204227.20401: done running TaskExecutor() for managed-node1/TASK: Enable EPEL 8 [028d2410-947f-ed67-a560-0000000000e3] 44109 1727204227.20409: sending task result for task 028d2410-947f-ed67-a560-0000000000e3 44109 1727204227.20619: done sending task result for task 028d2410-947f-ed67-a560-0000000000e3 44109 1727204227.20622: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "ansible_distribution_major_version in ['7', '8']", "skip_reason": "Conditional result was False" } 44109 1727204227.20668: no more pending results, returning what we have 44109 1727204227.20671: results queue empty 44109 1727204227.20672: checking for any_errors_fatal 44109 1727204227.20681: done checking for any_errors_fatal 44109 1727204227.20681: checking for 
max_fail_percentage 44109 1727204227.20683: done checking for max_fail_percentage 44109 1727204227.20684: checking to see if all hosts have failed and the running result is not ok 44109 1727204227.20685: done checking to see if all hosts have failed 44109 1727204227.20685: getting the remaining hosts for this loop 44109 1727204227.20687: done getting the remaining hosts for this loop 44109 1727204227.20690: getting the next task for host managed-node1 44109 1727204227.20699: done getting next task for host managed-node1 44109 1727204227.20702: ^ task is: TASK: Enable EPEL 6 44109 1727204227.20707: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44109 1727204227.20711: getting variables 44109 1727204227.20712: in VariableManager get_vars() 44109 1727204227.20741: Calling all_inventory to load vars for managed-node1 44109 1727204227.20744: Calling groups_inventory to load vars for managed-node1 44109 1727204227.20748: Calling all_plugins_inventory to load vars for managed-node1 44109 1727204227.20760: Calling all_plugins_play to load vars for managed-node1 44109 1727204227.20763: Calling groups_plugins_inventory to load vars for managed-node1 44109 1727204227.20766: Calling groups_plugins_play to load vars for managed-node1 44109 1727204227.21032: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44109 1727204227.21216: done with get_vars() 44109 1727204227.21225: done getting variables 44109 1727204227.21283: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Enable EPEL 6] *********************************************************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:42 Tuesday 24 September 2024 14:57:07 -0400 (0:00:00.026) 0:00:04.009 ***** 44109 1727204227.21308: entering _queue_task() for managed-node1/copy 44109 1727204227.21523: worker is 1 (out of 1 available) 44109 1727204227.21536: exiting _queue_task() for managed-node1/copy 44109 1727204227.21548: done queuing things up, now waiting for results queue to drain 44109 1727204227.21549: waiting for pending results... 
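The `HOST STATE: block=…, task=…` dumps above are nested iterator state: each block keeps its own task counter plus optional child states for its tasks/rescue/always sections, and the nesting depth in the dump tracks include/block nesting in the play. A stripped-down model of that structure, assuming field names as they appear in the dumps (the real class is Ansible's play-iterator host state, which carries more fields than shown here):

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class HostState:
    """Simplified model of the per-host play-iterator state, limited to
    fields visible in the log's 'HOST STATE:' dumps."""
    block: int = 0
    task: int = 0
    rescue: int = 0
    always: int = 0
    run_state: int = 1
    fail_state: int = 0
    tasks_child_state: Optional["HostState"] = None
    did_rescue: bool = False

# The state dumped before 'TASK: Enable EPEL 6' nests three levels deep:
state = HostState(
    block=2, task=4,
    tasks_child_state=HostState(
        block=0, task=4,
        tasks_child_state=HostState(block=0, task=1),
    ),
)
print(state.tasks_child_state.tasks_child_state.task)
```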
44109 1727204227.21806: running TaskExecutor() for managed-node1/TASK: Enable EPEL 6 44109 1727204227.21983: in run() - task 028d2410-947f-ed67-a560-0000000000e5 44109 1727204227.21987: variable 'ansible_search_path' from source: unknown 44109 1727204227.21989: variable 'ansible_search_path' from source: unknown 44109 1727204227.21998: calling self._execute() 44109 1727204227.22074: variable 'ansible_host' from source: host vars for 'managed-node1' 44109 1727204227.22090: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44109 1727204227.22109: variable 'omit' from source: magic vars 44109 1727204227.22495: variable 'ansible_distribution' from source: facts 44109 1727204227.22513: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True 44109 1727204227.22650: variable 'ansible_distribution_major_version' from source: facts 44109 1727204227.22653: Evaluated conditional (ansible_distribution_major_version == '6'): False 44109 1727204227.22756: when evaluation is False, skipping this task 44109 1727204227.22760: _execute() done 44109 1727204227.22762: dumping result to json 44109 1727204227.22764: done dumping result, returning 44109 1727204227.22767: done running TaskExecutor() for managed-node1/TASK: Enable EPEL 6 [028d2410-947f-ed67-a560-0000000000e5] 44109 1727204227.22769: sending task result for task 028d2410-947f-ed67-a560-0000000000e5 44109 1727204227.22838: done sending task result for task 028d2410-947f-ed67-a560-0000000000e5 44109 1727204227.22841: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "ansible_distribution_major_version == '6'", "skip_reason": "Conditional result was False" } 44109 1727204227.22890: no more pending results, returning what we have 44109 1727204227.22894: results queue empty 44109 1727204227.22895: checking for any_errors_fatal 44109 1727204227.22899: done checking for any_errors_fatal 44109 1727204227.22900: checking for max_fail_percentage 
44109 1727204227.22902: done checking for max_fail_percentage 44109 1727204227.22902: checking to see if all hosts have failed and the running result is not ok 44109 1727204227.22903: done checking to see if all hosts have failed 44109 1727204227.22904: getting the remaining hosts for this loop 44109 1727204227.22905: done getting the remaining hosts for this loop 44109 1727204227.22909: getting the next task for host managed-node1 44109 1727204227.22917: done getting next task for host managed-node1 44109 1727204227.22920: ^ task is: TASK: Set network provider to 'nm' 44109 1727204227.22923: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44109 1727204227.22927: getting variables 44109 1727204227.22928: in VariableManager get_vars() 44109 1727204227.22959: Calling all_inventory to load vars for managed-node1 44109 1727204227.22962: Calling groups_inventory to load vars for managed-node1 44109 1727204227.22966: Calling all_plugins_inventory to load vars for managed-node1 44109 1727204227.22980: Calling all_plugins_play to load vars for managed-node1 44109 1727204227.22983: Calling groups_plugins_inventory to load vars for managed-node1 44109 1727204227.22987: Calling groups_plugins_play to load vars for managed-node1 44109 1727204227.23364: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44109 1727204227.23543: done with get_vars() 44109 1727204227.23552: done getting variables 44109 1727204227.23610: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 
(found_in_cache=True, class_only=True) TASK [Set network provider to 'nm'] ******************************************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/tests_routing_rules_nm.yml:13 Tuesday 24 September 2024 14:57:07 -0400 (0:00:00.023) 0:00:04.032 ***** 44109 1727204227.23635: entering _queue_task() for managed-node1/set_fact 44109 1727204227.23857: worker is 1 (out of 1 available) 44109 1727204227.23868: exiting _queue_task() for managed-node1/set_fact 44109 1727204227.24082: done queuing things up, now waiting for results queue to drain 44109 1727204227.24083: waiting for pending results... 44109 1727204227.24208: running TaskExecutor() for managed-node1/TASK: Set network provider to 'nm' 44109 1727204227.24214: in run() - task 028d2410-947f-ed67-a560-000000000007 44109 1727204227.24217: variable 'ansible_search_path' from source: unknown 44109 1727204227.24249: calling self._execute() 44109 1727204227.24335: variable 'ansible_host' from source: host vars for 'managed-node1' 44109 1727204227.24347: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44109 1727204227.24361: variable 'omit' from source: magic vars 44109 1727204227.24474: variable 'omit' from source: magic vars 44109 1727204227.24511: variable 'omit' from source: magic vars 44109 1727204227.24582: variable 'omit' from source: magic vars 44109 1727204227.24604: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 44109 1727204227.24653: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 44109 1727204227.24681: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 44109 1727204227.24743: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44109 1727204227.24746: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44109 1727204227.24758: variable 'inventory_hostname' from source: host vars for 'managed-node1' 44109 1727204227.24767: variable 'ansible_host' from source: host vars for 'managed-node1' 44109 1727204227.24775: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44109 1727204227.24882: Set connection var ansible_connection to ssh 44109 1727204227.24894: Set connection var ansible_timeout to 10 44109 1727204227.24904: Set connection var ansible_module_compression to ZIP_DEFLATED 44109 1727204227.24916: Set connection var ansible_pipelining to False 44109 1727204227.24959: Set connection var ansible_shell_executable to /bin/sh 44109 1727204227.24962: Set connection var ansible_shell_type to sh 44109 1727204227.24964: variable 'ansible_shell_executable' from source: unknown 44109 1727204227.24966: variable 'ansible_connection' from source: unknown 44109 1727204227.24968: variable 'ansible_module_compression' from source: unknown 44109 1727204227.24969: variable 'ansible_shell_type' from source: unknown 44109 1727204227.24978: variable 'ansible_shell_executable' from source: unknown 44109 1727204227.24986: variable 'ansible_host' from source: host vars for 'managed-node1' 44109 1727204227.24994: variable 'ansible_pipelining' from source: unknown 44109 1727204227.24999: variable 'ansible_timeout' from source: unknown 44109 1727204227.25006: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44109 1727204227.25179: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 44109 1727204227.25184: variable 'omit' from source: magic vars 44109 1727204227.25186: starting 
attempt loop 44109 1727204227.25188: running the handler 44109 1727204227.25191: handler run complete 44109 1727204227.25205: attempt loop complete, returning result 44109 1727204227.25212: _execute() done 44109 1727204227.25219: dumping result to json 44109 1727204227.25226: done dumping result, returning 44109 1727204227.25282: done running TaskExecutor() for managed-node1/TASK: Set network provider to 'nm' [028d2410-947f-ed67-a560-000000000007] 44109 1727204227.25285: sending task result for task 028d2410-947f-ed67-a560-000000000007 44109 1727204227.25347: done sending task result for task 028d2410-947f-ed67-a560-000000000007 44109 1727204227.25350: WORKER PROCESS EXITING ok: [managed-node1] => { "ansible_facts": { "network_provider": "nm" }, "changed": false } 44109 1727204227.25440: no more pending results, returning what we have 44109 1727204227.25443: results queue empty 44109 1727204227.25444: checking for any_errors_fatal 44109 1727204227.25448: done checking for any_errors_fatal 44109 1727204227.25449: checking for max_fail_percentage 44109 1727204227.25450: done checking for max_fail_percentage 44109 1727204227.25451: checking to see if all hosts have failed and the running result is not ok 44109 1727204227.25451: done checking to see if all hosts have failed 44109 1727204227.25452: getting the remaining hosts for this loop 44109 1727204227.25453: done getting the remaining hosts for this loop 44109 1727204227.25460: getting the next task for host managed-node1 44109 1727204227.25466: done getting next task for host managed-node1 44109 1727204227.25468: ^ task is: TASK: meta (flush_handlers) 44109 1727204227.25470: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44109 1727204227.25474: getting variables 44109 1727204227.25477: in VariableManager get_vars() 44109 1727204227.25508: Calling all_inventory to load vars for managed-node1 44109 1727204227.25511: Calling groups_inventory to load vars for managed-node1 44109 1727204227.25514: Calling all_plugins_inventory to load vars for managed-node1 44109 1727204227.25523: Calling all_plugins_play to load vars for managed-node1 44109 1727204227.25526: Calling groups_plugins_inventory to load vars for managed-node1 44109 1727204227.25529: Calling groups_plugins_play to load vars for managed-node1 44109 1727204227.25678: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44109 1727204227.25791: done with get_vars() 44109 1727204227.25799: done getting variables 44109 1727204227.25851: in VariableManager get_vars() 44109 1727204227.25857: Calling all_inventory to load vars for managed-node1 44109 1727204227.25859: Calling groups_inventory to load vars for managed-node1 44109 1727204227.25860: Calling all_plugins_inventory to load vars for managed-node1 44109 1727204227.25863: Calling all_plugins_play to load vars for managed-node1 44109 1727204227.25864: Calling groups_plugins_inventory to load vars for managed-node1 44109 1727204227.25866: Calling groups_plugins_play to load vars for managed-node1 44109 1727204227.25950: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44109 1727204227.26075: done with get_vars() 44109 1727204227.26086: done queuing things up, now waiting for results queue to drain 44109 1727204227.26087: results queue empty 44109 1727204227.26087: checking for any_errors_fatal 44109 1727204227.26089: done checking for any_errors_fatal 44109 1727204227.26089: checking for max_fail_percentage 44109 1727204227.26090: done checking for max_fail_percentage 44109 1727204227.26090: checking to see if all hosts have failed and the running result is not 
ok 44109 1727204227.26091: done checking to see if all hosts have failed 44109 1727204227.26091: getting the remaining hosts for this loop 44109 1727204227.26092: done getting the remaining hosts for this loop 44109 1727204227.26093: getting the next task for host managed-node1 44109 1727204227.26096: done getting next task for host managed-node1 44109 1727204227.26096: ^ task is: TASK: meta (flush_handlers) 44109 1727204227.26097: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44109 1727204227.26103: getting variables 44109 1727204227.26104: in VariableManager get_vars() 44109 1727204227.26108: Calling all_inventory to load vars for managed-node1 44109 1727204227.26110: Calling groups_inventory to load vars for managed-node1 44109 1727204227.26111: Calling all_plugins_inventory to load vars for managed-node1 44109 1727204227.26115: Calling all_plugins_play to load vars for managed-node1 44109 1727204227.26116: Calling groups_plugins_inventory to load vars for managed-node1 44109 1727204227.26118: Calling groups_plugins_play to load vars for managed-node1 44109 1727204227.26197: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44109 1727204227.26300: done with get_vars() 44109 1727204227.26306: done getting variables 44109 1727204227.26334: in VariableManager get_vars() 44109 1727204227.26339: Calling all_inventory to load vars for managed-node1 44109 1727204227.26340: Calling groups_inventory to load vars for managed-node1 44109 1727204227.26342: Calling all_plugins_inventory to load vars for managed-node1 44109 1727204227.26346: Calling all_plugins_play to load vars for managed-node1 44109 1727204227.26348: Calling groups_plugins_inventory to load vars for 
managed-node1 44109 1727204227.26350: Calling groups_plugins_play to load vars for managed-node1 44109 1727204227.26428: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44109 1727204227.26546: done with get_vars() 44109 1727204227.26553: done queuing things up, now waiting for results queue to drain 44109 1727204227.26554: results queue empty 44109 1727204227.26555: checking for any_errors_fatal 44109 1727204227.26555: done checking for any_errors_fatal 44109 1727204227.26556: checking for max_fail_percentage 44109 1727204227.26556: done checking for max_fail_percentage 44109 1727204227.26557: checking to see if all hosts have failed and the running result is not ok 44109 1727204227.26557: done checking to see if all hosts have failed 44109 1727204227.26558: getting the remaining hosts for this loop 44109 1727204227.26558: done getting the remaining hosts for this loop 44109 1727204227.26560: getting the next task for host managed-node1 44109 1727204227.26563: done getting next task for host managed-node1 44109 1727204227.26564: ^ task is: None 44109 1727204227.26565: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44109 1727204227.26566: done queuing things up, now waiting for results queue to drain 44109 1727204227.26567: results queue empty 44109 1727204227.26567: checking for any_errors_fatal 44109 1727204227.26568: done checking for any_errors_fatal 44109 1727204227.26568: checking for max_fail_percentage 44109 1727204227.26569: done checking for max_fail_percentage 44109 1727204227.26569: checking to see if all hosts have failed and the running result is not ok 44109 1727204227.26569: done checking to see if all hosts have failed 44109 1727204227.26571: getting the next task for host managed-node1 44109 1727204227.26572: done getting next task for host managed-node1 44109 1727204227.26572: ^ task is: None 44109 1727204227.26573: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44109 1727204227.26609: in VariableManager get_vars() 44109 1727204227.26624: done with get_vars() 44109 1727204227.26628: in VariableManager get_vars() 44109 1727204227.26635: done with get_vars() 44109 1727204227.26638: variable 'omit' from source: magic vars 44109 1727204227.26657: in VariableManager get_vars() 44109 1727204227.26664: done with get_vars() 44109 1727204227.26682: variable 'omit' from source: magic vars PLAY [Test for testing routing rules] ****************************************** 44109 1727204227.26854: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py (found_in_cache=True, class_only=False) 44109 1727204227.26874: getting the remaining hosts for this loop 44109 1727204227.26877: done getting the remaining hosts for this loop 44109 1727204227.26879: getting the next task for host managed-node1 44109 1727204227.26881: done getting next task for host managed-node1 44109 1727204227.26882: ^ task is: TASK: Gathering Facts 44109 1727204227.26883: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44109 1727204227.26884: getting variables 44109 1727204227.26885: in VariableManager get_vars() 44109 1727204227.26894: Calling all_inventory to load vars for managed-node1 44109 1727204227.26896: Calling groups_inventory to load vars for managed-node1 44109 1727204227.26897: Calling all_plugins_inventory to load vars for managed-node1 44109 1727204227.26900: Calling all_plugins_play to load vars for managed-node1 44109 1727204227.26908: Calling groups_plugins_inventory to load vars for managed-node1 44109 1727204227.26910: Calling groups_plugins_play to load vars for managed-node1 44109 1727204227.26988: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44109 1727204227.27099: done with get_vars() 44109 1727204227.27107: done getting variables 44109 1727204227.27133: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Gathering Facts] ********************************************************* task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_routing_rules.yml:3 Tuesday 24 September 2024 14:57:07 -0400 (0:00:00.035) 0:00:04.067 ***** 44109 1727204227.27149: entering _queue_task() for managed-node1/gather_facts 44109 1727204227.27323: worker is 1 (out of 1 available) 44109 1727204227.27335: exiting _queue_task() for managed-node1/gather_facts 44109 1727204227.27346: done queuing things up, now waiting for results queue to drain 44109 1727204227.27347: waiting for pending results... 
44109 1727204227.27590: running TaskExecutor() for managed-node1/TASK: Gathering Facts 44109 1727204227.27607: in run() - task 028d2410-947f-ed67-a560-00000000010b 44109 1727204227.27629: variable 'ansible_search_path' from source: unknown 44109 1727204227.27667: calling self._execute() 44109 1727204227.27745: variable 'ansible_host' from source: host vars for 'managed-node1' 44109 1727204227.27789: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44109 1727204227.27803: variable 'omit' from source: magic vars 44109 1727204227.28147: variable 'ansible_distribution_major_version' from source: facts 44109 1727204227.28163: Evaluated conditional (ansible_distribution_major_version != '6'): True 44109 1727204227.28173: variable 'omit' from source: magic vars 44109 1727204227.28202: variable 'omit' from source: magic vars 44109 1727204227.28239: variable 'omit' from source: magic vars 44109 1727204227.28381: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 44109 1727204227.28385: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 44109 1727204227.28387: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 44109 1727204227.28390: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44109 1727204227.28392: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44109 1727204227.28411: variable 'inventory_hostname' from source: host vars for 'managed-node1' 44109 1727204227.28426: variable 'ansible_host' from source: host vars for 'managed-node1' 44109 1727204227.28431: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44109 1727204227.28534: Set connection var ansible_connection to ssh 44109 1727204227.28551: Set 
connection var ansible_timeout to 10 44109 1727204227.28555: Set connection var ansible_module_compression to ZIP_DEFLATED 44109 1727204227.28562: Set connection var ansible_pipelining to False 44109 1727204227.28567: Set connection var ansible_shell_executable to /bin/sh 44109 1727204227.28572: Set connection var ansible_shell_type to sh 44109 1727204227.28594: variable 'ansible_shell_executable' from source: unknown 44109 1727204227.28597: variable 'ansible_connection' from source: unknown 44109 1727204227.28600: variable 'ansible_module_compression' from source: unknown 44109 1727204227.28603: variable 'ansible_shell_type' from source: unknown 44109 1727204227.28605: variable 'ansible_shell_executable' from source: unknown 44109 1727204227.28607: variable 'ansible_host' from source: host vars for 'managed-node1' 44109 1727204227.28613: variable 'ansible_pipelining' from source: unknown 44109 1727204227.28615: variable 'ansible_timeout' from source: unknown 44109 1727204227.28618: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44109 1727204227.28762: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 44109 1727204227.28771: variable 'omit' from source: magic vars 44109 1727204227.28774: starting attempt loop 44109 1727204227.28778: running the handler 44109 1727204227.28795: variable 'ansible_facts' from source: unknown 44109 1727204227.28815: _low_level_execute_command(): starting 44109 1727204227.28818: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 44109 1727204227.29319: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
<<< 44109 1727204227.29324: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration <<< 44109 1727204227.29327: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44109 1727204227.29383: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 44109 1727204227.29389: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44109 1727204227.29391: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44109 1727204227.29477: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44109 1727204227.32023: stdout chunk (state=3): >>>/root <<< 44109 1727204227.32193: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44109 1727204227.32220: stderr chunk (state=3): >>><<< 44109 1727204227.32223: stdout chunk (state=3): >>><<< 44109 1727204227.32244: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44109 1727204227.32258: _low_level_execute_command(): starting 44109 1727204227.32264: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204227.3224375-44451-2009041415477 `" && echo ansible-tmp-1727204227.3224375-44451-2009041415477="` echo /root/.ansible/tmp/ansible-tmp-1727204227.3224375-44451-2009041415477 `" ) && sleep 0' 44109 1727204227.32719: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 44109 1727204227.32722: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found <<< 44109 1727204227.32725: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 44109 1727204227.32734: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.14.47 is 
address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found <<< 44109 1727204227.32737: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44109 1727204227.32781: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 44109 1727204227.32787: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44109 1727204227.32789: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44109 1727204227.32873: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44109 1727204227.35798: stdout chunk (state=3): >>>ansible-tmp-1727204227.3224375-44451-2009041415477=/root/.ansible/tmp/ansible-tmp-1727204227.3224375-44451-2009041415477 <<< 44109 1727204227.35987: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44109 1727204227.36014: stderr chunk (state=3): >>><<< 44109 1727204227.36017: stdout chunk (state=3): >>><<< 44109 1727204227.36033: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204227.3224375-44451-2009041415477=/root/.ansible/tmp/ansible-tmp-1727204227.3224375-44451-2009041415477 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address 
debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44109 1727204227.36063: variable 'ansible_module_compression' from source: unknown 44109 1727204227.36103: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-44109pzfqangk/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 44109 1727204227.36150: variable 'ansible_facts' from source: unknown 44109 1727204227.36284: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204227.3224375-44451-2009041415477/AnsiballZ_setup.py 44109 1727204227.36389: Sending initial data 44109 1727204227.36393: Sent initial data (152 bytes) 44109 1727204227.36995: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 44109 1727204227.37021: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44109 1727204227.37038: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44109 1727204227.37101: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44109 1727204227.37164: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 44109 1727204227.37203: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44109 1727204227.37273: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44109 1727204227.37389: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44109 1727204227.39861: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 44109 1727204227.39945: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 44109 1727204227.40037: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-44109pzfqangk/tmpyetsv4vu /root/.ansible/tmp/ansible-tmp-1727204227.3224375-44451-2009041415477/AnsiballZ_setup.py <<< 44109 1727204227.40041: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204227.3224375-44451-2009041415477/AnsiballZ_setup.py" <<< 44109 1727204227.40122: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-44109pzfqangk/tmpyetsv4vu" to remote "/root/.ansible/tmp/ansible-tmp-1727204227.3224375-44451-2009041415477/AnsiballZ_setup.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204227.3224375-44451-2009041415477/AnsiballZ_setup.py" <<< 44109 1727204227.41443: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44109 1727204227.41487: stderr chunk (state=3): >>><<< 44109 1727204227.41491: stdout chunk (state=3): >>><<< 44109 1727204227.41507: done transferring module to remote 44109 1727204227.41517: _low_level_execute_command(): starting 44109 1727204227.41521: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204227.3224375-44451-2009041415477/ /root/.ansible/tmp/ansible-tmp-1727204227.3224375-44451-2009041415477/AnsiballZ_setup.py && sleep 0' 44109 1727204227.41966: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 44109 1727204227.41970: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found <<< 44109 1727204227.41972: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44109 1727204227.41974: 
stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44109 1727204227.41988: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44109 1727204227.42042: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 44109 1727204227.42046: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44109 1727204227.42051: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44109 1727204227.42151: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44109 1727204227.45017: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44109 1727204227.45074: stderr chunk (state=3): >>><<< 44109 1727204227.45183: stdout chunk (state=3): >>><<< 44109 1727204227.45186: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration 
data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44109 1727204227.45188: _low_level_execute_command(): starting 44109 1727204227.45190: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204227.3224375-44451-2009041415477/AnsiballZ_setup.py && sleep 0' 44109 1727204227.45766: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44109 1727204227.45770: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 44109 1727204227.45786: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 44109 1727204227.45893: 
stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44109 1727204228.36556: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_env": {"SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.45.138 50362 10.31.14.47 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.45.138 50362 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_system": "Linux", "ansible_kernel": "6.11.0-25.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Mon Sep 16 20:35:26 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "managed-node1", "ansible_hostname": "managed-node1", "ansible_nodename": "managed-node1", "ansible_domain": "", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec277914f6c5b9c03bd977e30033112b", "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", 
"ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCWk3MCbpUJBEaXgG200pw3DBo34ukitT+wfYcFTuNXyUUPaXQe7v940X0cf5U78BgS3AAiRxfHirMb+4r43rwxBe5tl4Vq2WM+kz3JnOtxK8ZXTmwS9PbltzX5fg5CVds9Bu6KIwABJMlgT9CTHVjFlTBc8wpoeJvx8wVgMwQlnF+PFN/lzD0gn0sF11bqe2QCvxmm9r7Lum/QdIVGgOiZMMbULJZb0Iih24Tn74Ho6n9zLSFZ5FiFifjm7M6k1hVtfcAQi3GfPdUxkN0Z66f3KaW4hptFlHxttjLliuAyfNF4UrXIab7y/nDix1ll4x4lLHsVRpcwtmVVe/Z+2/pcmRv4fjw8YzWY1pLV5u1BUUBthICgfv0pXEuLP9UD/krnjy0Ho9gAbtjdoRWtn7gvjRds+WEwk83rZS3UnAc3pl2DmHzbp4IfRC1zp8eJPJoVwcSTEr61su59tkNntjdKAcLWeas1p9XBaTIRx7aqRp7Vdet96xbSnDnBCriXgAU=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBJzECuW8BnvEbYnQxnxmebvg9TYk9r0OUd9aUg8FFv4MvjSzW8tCfnW556hw9n4PI2hShtAWz7XExrMZPlTQXRo=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAINUE+mKAHEgVl/vTdVMwRCu3lDCTOYBl1RcikvxylCeg", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_fips": false, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, 
"ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2919, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 612, "free": 2919}, "nocache": {"free": 3278, "used": 253}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec277914-f6c5-b9c0-3bd9-77e30033112b", "ansible_product_uuid": "ec277914-f6c5-b9c0-3bd9-77e30033112b", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["973ca870-ed1b-4e56-a8b4-735608119a28"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "973ca870-ed1b-4e56-a8b4-735608119a28", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["973ca870-ed1b-4e56-a8b4-735608119a28"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 819, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": 
[{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261785169920, "block_size": 4096, "block_total": 65519099, "block_available": 63912395, "block_used": 1606704, "inode_total": 131070960, "inode_available": 131027258, "inode_used": 43702, "uuid": "973ca870-ed1b-4e56-a8b4-735608119a28"}], "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-25.el10.x86_64", "root": "UUID=973ca870-ed1b-4e56-a8b4-735608119a28", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-25.el10.x86_64", "root": "UUID=973ca870-ed1b-4e56-a8b4-735608119a28", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_is_chroot": false, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_loadavg": {"1m": 0.47021484375, "5m": 0.509765625, "15m": 0.29931640625}, "ansible_interfaces": ["eth0", "lo"], "ansible_eth0": {"device": "eth0", "macaddress": "0a:ff:dd:89:9b:e5", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.14.47", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22"}, "ipv6": [{"address": "fe80::8ff:ddff:fe89:9be5", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", 
"tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", 
"hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off 
[fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.12.1", "interface": "eth0", "address": "10.31.14.47", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22", "macaddress": "0a:ff:dd:89:9b:e5", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.14.47"], "ansible_all_ipv6_addresses": ["fe80::8ff:ddff:fe89:9be5"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.14.47", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::8ff:ddff:fe89:9be5"]}, "ansible_fibre_channel_wwn": [], "ansible_iscsi_iqn": "", "ansible_local": {}, "ansible_apparmor": {"status": "disabled"}, "ansible_lsb": {}, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:11e86335-d786-4518-8abc-c9417b351256", "ansible_service_mgr": "systemd", "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Tuesday", "weekday_number": "2", "weeknumber": "39", "day": "24", "hour": "14", "minute": "57", "second": "08", "epoch": "1727204228", "epoch_int": "1727204228", "date": "2024-09-24", "time": "14:57:08", "iso8601_micro": "2024-09-24T18:57:08.359859Z", "iso8601": "2024-09-24T18:57:08Z", "iso8601_basic": "20240924T145708359859", "iso8601_basic_short": "20240924T145708", "tz": "EDT", "tz_dst": 
"EDT", "tz_offset": "-0400"}, "ansible_pkg_mgr": "dnf", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 44109 1727204228.38909: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.47 closed. <<< 44109 1727204228.38914: stdout chunk (state=3): >>><<< 44109 1727204228.38916: stderr chunk (state=3): >>><<< 44109 1727204228.38920: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_env": {"SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.45.138 50362 10.31.14.47 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.45.138 50362 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_system": "Linux", "ansible_kernel": "6.11.0-25.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Mon Sep 16 20:35:26 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "managed-node1", "ansible_hostname": "managed-node1", "ansible_nodename": "managed-node1", 
"ansible_domain": "", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec277914f6c5b9c03bd977e30033112b", "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCWk3MCbpUJBEaXgG200pw3DBo34ukitT+wfYcFTuNXyUUPaXQe7v940X0cf5U78BgS3AAiRxfHirMb+4r43rwxBe5tl4Vq2WM+kz3JnOtxK8ZXTmwS9PbltzX5fg5CVds9Bu6KIwABJMlgT9CTHVjFlTBc8wpoeJvx8wVgMwQlnF+PFN/lzD0gn0sF11bqe2QCvxmm9r7Lum/QdIVGgOiZMMbULJZb0Iih24Tn74Ho6n9zLSFZ5FiFifjm7M6k1hVtfcAQi3GfPdUxkN0Z66f3KaW4hptFlHxttjLliuAyfNF4UrXIab7y/nDix1ll4x4lLHsVRpcwtmVVe/Z+2/pcmRv4fjw8YzWY1pLV5u1BUUBthICgfv0pXEuLP9UD/krnjy0Ho9gAbtjdoRWtn7gvjRds+WEwk83rZS3UnAc3pl2DmHzbp4IfRC1zp8eJPJoVwcSTEr61su59tkNntjdKAcLWeas1p9XBaTIRx7aqRp7Vdet96xbSnDnBCriXgAU=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBJzECuW8BnvEbYnQxnxmebvg9TYk9r0OUd9aUg8FFv4MvjSzW8tCfnW556hw9n4PI2hShtAWz7XExrMZPlTQXRo=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAINUE+mKAHEgVl/vTdVMwRCu3lDCTOYBl1RcikvxylCeg", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_fips": false, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 
0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2919, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 612, "free": 2919}, "nocache": {"free": 3278, "used": 253}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec277914-f6c5-b9c0-3bd9-77e30033112b", "ansible_product_uuid": "ec277914-f6c5-b9c0-3bd9-77e30033112b", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["973ca870-ed1b-4e56-a8b4-735608119a28"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "973ca870-ed1b-4e56-a8b4-735608119a28", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", 
"uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["973ca870-ed1b-4e56-a8b4-735608119a28"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 819, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261785169920, "block_size": 4096, "block_total": 65519099, "block_available": 63912395, "block_used": 1606704, "inode_total": 131070960, "inode_available": 131027258, "inode_used": 43702, "uuid": "973ca870-ed1b-4e56-a8b4-735608119a28"}], "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-25.el10.x86_64", "root": "UUID=973ca870-ed1b-4e56-a8b4-735608119a28", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-25.el10.x86_64", "root": "UUID=973ca870-ed1b-4e56-a8b4-735608119a28", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_is_chroot": false, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_loadavg": {"1m": 0.47021484375, "5m": 0.509765625, "15m": 0.29931640625}, "ansible_interfaces": ["eth0", "lo"], "ansible_eth0": {"device": "eth0", "macaddress": "0a:ff:dd:89:9b:e5", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": 
false, "ipv4": {"address": "10.31.14.47", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22"}, "ipv6": [{"address": "fe80::8ff:ddff:fe89:9be5", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off 
[fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": 
"off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.12.1", "interface": "eth0", "address": "10.31.14.47", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22", "macaddress": "0a:ff:dd:89:9b:e5", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.14.47"], "ansible_all_ipv6_addresses": ["fe80::8ff:ddff:fe89:9be5"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.14.47", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::8ff:ddff:fe89:9be5"]}, "ansible_fibre_channel_wwn": [], "ansible_iscsi_iqn": "", "ansible_local": {}, "ansible_apparmor": {"status": "disabled"}, "ansible_lsb": {}, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:11e86335-d786-4518-8abc-c9417b351256", "ansible_service_mgr": "systemd", "ansible_date_time": {"year": "2024", "month": "09", "weekday": 
"Tuesday", "weekday_number": "2", "weeknumber": "39", "day": "24", "hour": "14", "minute": "57", "second": "08", "epoch": "1727204228", "epoch_int": "1727204228", "date": "2024-09-24", "time": "14:57:08", "iso8601_micro": "2024-09-24T18:57:08.359859Z", "iso8601": "2024-09-24T18:57:08Z", "iso8601_basic": "20240924T145708359859", "iso8601_basic_short": "20240924T145708", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_pkg_mgr": "dnf", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.47 closed. 
44109 1727204228.39673: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204227.3224375-44451-2009041415477/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 44109 1727204228.39679: _low_level_execute_command(): starting 44109 1727204228.39682: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204227.3224375-44451-2009041415477/ > /dev/null 2>&1 && sleep 0' 44109 1727204228.40709: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44109 1727204228.40935: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 44109 1727204228.40947: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK <<< 44109 1727204228.41013: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44109 1727204228.41190: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44109 1727204228.43117: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44109 1727204228.43173: stderr chunk (state=3): >>><<< 44109 1727204228.43186: stdout chunk (state=3): >>><<< 44109 1727204228.43208: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44109 1727204228.43222: handler run complete 44109 1727204228.43351: variable 'ansible_facts' from source: 
unknown 44109 1727204228.43461: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44109 1727204228.43786: variable 'ansible_facts' from source: unknown 44109 1727204228.43878: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44109 1727204228.44035: attempt loop complete, returning result 44109 1727204228.44045: _execute() done 44109 1727204228.44053: dumping result to json 44109 1727204228.44136: done dumping result, returning 44109 1727204228.44139: done running TaskExecutor() for managed-node1/TASK: Gathering Facts [028d2410-947f-ed67-a560-00000000010b] 44109 1727204228.44142: sending task result for task 028d2410-947f-ed67-a560-00000000010b 44109 1727204228.44722: done sending task result for task 028d2410-947f-ed67-a560-00000000010b 44109 1727204228.44725: WORKER PROCESS EXITING ok: [managed-node1] 44109 1727204228.45065: no more pending results, returning what we have 44109 1727204228.45068: results queue empty 44109 1727204228.45069: checking for any_errors_fatal 44109 1727204228.45070: done checking for any_errors_fatal 44109 1727204228.45071: checking for max_fail_percentage 44109 1727204228.45072: done checking for max_fail_percentage 44109 1727204228.45073: checking to see if all hosts have failed and the running result is not ok 44109 1727204228.45074: done checking to see if all hosts have failed 44109 1727204228.45074: getting the remaining hosts for this loop 44109 1727204228.45077: done getting the remaining hosts for this loop 44109 1727204228.45080: getting the next task for host managed-node1 44109 1727204228.45086: done getting next task for host managed-node1 44109 1727204228.45092: ^ task is: TASK: meta (flush_handlers) 44109 1727204228.45094: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44109 1727204228.45097: getting variables 44109 1727204228.45099: in VariableManager get_vars() 44109 1727204228.45126: Calling all_inventory to load vars for managed-node1 44109 1727204228.45128: Calling groups_inventory to load vars for managed-node1 44109 1727204228.45131: Calling all_plugins_inventory to load vars for managed-node1 44109 1727204228.45141: Calling all_plugins_play to load vars for managed-node1 44109 1727204228.45143: Calling groups_plugins_inventory to load vars for managed-node1 44109 1727204228.45146: Calling groups_plugins_play to load vars for managed-node1 44109 1727204228.45339: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44109 1727204228.45536: done with get_vars() 44109 1727204228.45546: done getting variables 44109 1727204228.45616: in VariableManager get_vars() 44109 1727204228.45628: Calling all_inventory to load vars for managed-node1 44109 1727204228.45630: Calling groups_inventory to load vars for managed-node1 44109 1727204228.45637: Calling all_plugins_inventory to load vars for managed-node1 44109 1727204228.45642: Calling all_plugins_play to load vars for managed-node1 44109 1727204228.45645: Calling groups_plugins_inventory to load vars for managed-node1 44109 1727204228.45647: Calling groups_plugins_play to load vars for managed-node1 44109 1727204228.45818: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44109 1727204228.46007: done with get_vars() 44109 1727204228.46019: done queuing things up, now waiting for results queue to drain 44109 1727204228.46021: results queue empty 44109 1727204228.46021: checking for any_errors_fatal 44109 1727204228.46024: done checking for any_errors_fatal 44109 1727204228.46025: checking for max_fail_percentage 44109 1727204228.46026: done checking for 
max_fail_percentage 44109 1727204228.46027: checking to see if all hosts have failed and the running result is not ok 44109 1727204228.46028: done checking to see if all hosts have failed 44109 1727204228.46033: getting the remaining hosts for this loop 44109 1727204228.46034: done getting the remaining hosts for this loop 44109 1727204228.46036: getting the next task for host managed-node1 44109 1727204228.46040: done getting next task for host managed-node1 44109 1727204228.46042: ^ task is: TASK: Set type={{ type }} and interface={{ interface }} 44109 1727204228.46044: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44109 1727204228.46045: getting variables 44109 1727204228.46046: in VariableManager get_vars() 44109 1727204228.46056: Calling all_inventory to load vars for managed-node1 44109 1727204228.46058: Calling groups_inventory to load vars for managed-node1 44109 1727204228.46060: Calling all_plugins_inventory to load vars for managed-node1 44109 1727204228.46064: Calling all_plugins_play to load vars for managed-node1 44109 1727204228.46066: Calling groups_plugins_inventory to load vars for managed-node1 44109 1727204228.46074: Calling groups_plugins_play to load vars for managed-node1 44109 1727204228.46203: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44109 1727204228.46413: done with get_vars() 44109 1727204228.46421: done getting variables 44109 1727204228.46459: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 
(found_in_cache=True, class_only=True) 44109 1727204228.46597: variable 'type' from source: play vars 44109 1727204228.46602: variable 'interface' from source: play vars TASK [Set type=veth and interface=ethtest0] ************************************ task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_routing_rules.yml:10 Tuesday 24 September 2024 14:57:08 -0400 (0:00:01.194) 0:00:05.262 ***** 44109 1727204228.46643: entering _queue_task() for managed-node1/set_fact 44109 1727204228.46923: worker is 1 (out of 1 available) 44109 1727204228.46935: exiting _queue_task() for managed-node1/set_fact 44109 1727204228.47060: done queuing things up, now waiting for results queue to drain 44109 1727204228.47061: waiting for pending results... 44109 1727204228.47294: running TaskExecutor() for managed-node1/TASK: Set type=veth and interface=ethtest0 44109 1727204228.47299: in run() - task 028d2410-947f-ed67-a560-00000000000b 44109 1727204228.47302: variable 'ansible_search_path' from source: unknown 44109 1727204228.47322: calling self._execute() 44109 1727204228.47402: variable 'ansible_host' from source: host vars for 'managed-node1' 44109 1727204228.47415: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44109 1727204228.47431: variable 'omit' from source: magic vars 44109 1727204228.47786: variable 'ansible_distribution_major_version' from source: facts 44109 1727204228.47805: Evaluated conditional (ansible_distribution_major_version != '6'): True 44109 1727204228.47819: variable 'omit' from source: magic vars 44109 1727204228.47841: variable 'omit' from source: magic vars 44109 1727204228.47874: variable 'type' from source: play vars 44109 1727204228.48081: variable 'type' from source: play vars 44109 1727204228.48084: variable 'interface' from source: play vars 44109 1727204228.48087: variable 'interface' from source: play vars 44109 1727204228.48089: variable 'omit' from source: magic 
vars 44109 1727204228.48091: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 44109 1727204228.48132: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 44109 1727204228.48157: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 44109 1727204228.48181: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44109 1727204228.48198: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44109 1727204228.48234: variable 'inventory_hostname' from source: host vars for 'managed-node1' 44109 1727204228.48244: variable 'ansible_host' from source: host vars for 'managed-node1' 44109 1727204228.48252: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44109 1727204228.48360: Set connection var ansible_connection to ssh 44109 1727204228.48372: Set connection var ansible_timeout to 10 44109 1727204228.48385: Set connection var ansible_module_compression to ZIP_DEFLATED 44109 1727204228.48397: Set connection var ansible_pipelining to False 44109 1727204228.48406: Set connection var ansible_shell_executable to /bin/sh 44109 1727204228.48420: Set connection var ansible_shell_type to sh 44109 1727204228.48445: variable 'ansible_shell_executable' from source: unknown 44109 1727204228.48453: variable 'ansible_connection' from source: unknown 44109 1727204228.48460: variable 'ansible_module_compression' from source: unknown 44109 1727204228.48467: variable 'ansible_shell_type' from source: unknown 44109 1727204228.48474: variable 'ansible_shell_executable' from source: unknown 44109 1727204228.48484: variable 'ansible_host' from source: host vars for 'managed-node1' 44109 1727204228.48492: variable 'ansible_pipelining' from source: unknown 44109 1727204228.48499: 
variable 'ansible_timeout' from source: unknown 44109 1727204228.48507: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44109 1727204228.48648: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 44109 1727204228.48664: variable 'omit' from source: magic vars 44109 1727204228.48673: starting attempt loop 44109 1727204228.48682: running the handler 44109 1727204228.48780: handler run complete 44109 1727204228.48783: attempt loop complete, returning result 44109 1727204228.48785: _execute() done 44109 1727204228.48787: dumping result to json 44109 1727204228.48789: done dumping result, returning 44109 1727204228.48792: done running TaskExecutor() for managed-node1/TASK: Set type=veth and interface=ethtest0 [028d2410-947f-ed67-a560-00000000000b] 44109 1727204228.48794: sending task result for task 028d2410-947f-ed67-a560-00000000000b 44109 1727204228.48860: done sending task result for task 028d2410-947f-ed67-a560-00000000000b 44109 1727204228.48863: WORKER PROCESS EXITING ok: [managed-node1] => { "ansible_facts": { "interface": "ethtest0", "type": "veth" }, "changed": false } 44109 1727204228.48923: no more pending results, returning what we have 44109 1727204228.48925: results queue empty 44109 1727204228.48926: checking for any_errors_fatal 44109 1727204228.48929: done checking for any_errors_fatal 44109 1727204228.48930: checking for max_fail_percentage 44109 1727204228.48932: done checking for max_fail_percentage 44109 1727204228.48932: checking to see if all hosts have failed and the running result is not ok 44109 1727204228.48933: done checking to see if all hosts have failed 44109 1727204228.48934: getting the remaining hosts for this loop 44109 1727204228.48935: done getting the 
remaining hosts for this loop 44109 1727204228.48938: getting the next task for host managed-node1 44109 1727204228.48943: done getting next task for host managed-node1 44109 1727204228.48945: ^ task is: TASK: Include the task 'show_interfaces.yml' 44109 1727204228.48947: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44109 1727204228.48951: getting variables 44109 1727204228.48952: in VariableManager get_vars() 44109 1727204228.48988: Calling all_inventory to load vars for managed-node1 44109 1727204228.48991: Calling groups_inventory to load vars for managed-node1 44109 1727204228.48993: Calling all_plugins_inventory to load vars for managed-node1 44109 1727204228.49002: Calling all_plugins_play to load vars for managed-node1 44109 1727204228.49004: Calling groups_plugins_inventory to load vars for managed-node1 44109 1727204228.49007: Calling groups_plugins_play to load vars for managed-node1 44109 1727204228.49208: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44109 1727204228.49415: done with get_vars() 44109 1727204228.49426: done getting variables TASK [Include the task 'show_interfaces.yml'] ********************************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_routing_rules.yml:14 Tuesday 24 September 2024 14:57:08 -0400 (0:00:00.028) 0:00:05.291 ***** 44109 1727204228.49515: entering _queue_task() for managed-node1/include_tasks 44109 1727204228.49771: worker is 1 (out of 1 available) 44109 1727204228.49787: exiting _queue_task() for managed-node1/include_tasks 44109 1727204228.49798: done queuing things up, now waiting for results queue to drain 44109 1727204228.49799: waiting 
for pending results... 44109 1727204228.50205: running TaskExecutor() for managed-node1/TASK: Include the task 'show_interfaces.yml' 44109 1727204228.50210: in run() - task 028d2410-947f-ed67-a560-00000000000c 44109 1727204228.50214: variable 'ansible_search_path' from source: unknown 44109 1727204228.50219: calling self._execute() 44109 1727204228.50315: variable 'ansible_host' from source: host vars for 'managed-node1' 44109 1727204228.50327: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44109 1727204228.50343: variable 'omit' from source: magic vars 44109 1727204228.50698: variable 'ansible_distribution_major_version' from source: facts 44109 1727204228.50717: Evaluated conditional (ansible_distribution_major_version != '6'): True 44109 1727204228.50732: _execute() done 44109 1727204228.50740: dumping result to json 44109 1727204228.50748: done dumping result, returning 44109 1727204228.50766: done running TaskExecutor() for managed-node1/TASK: Include the task 'show_interfaces.yml' [028d2410-947f-ed67-a560-00000000000c] 44109 1727204228.51001: sending task result for task 028d2410-947f-ed67-a560-00000000000c 44109 1727204228.51160: no more pending results, returning what we have 44109 1727204228.51166: in VariableManager get_vars() 44109 1727204228.51210: Calling all_inventory to load vars for managed-node1 44109 1727204228.51213: Calling groups_inventory to load vars for managed-node1 44109 1727204228.51215: Calling all_plugins_inventory to load vars for managed-node1 44109 1727204228.51227: Calling all_plugins_play to load vars for managed-node1 44109 1727204228.51230: Calling groups_plugins_inventory to load vars for managed-node1 44109 1727204228.51233: Calling groups_plugins_play to load vars for managed-node1 44109 1727204228.51965: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44109 1727204228.52452: done with get_vars() 44109 1727204228.52458: variable 
'ansible_search_path' from source: unknown 44109 1727204228.52471: done sending task result for task 028d2410-947f-ed67-a560-00000000000c 44109 1727204228.52474: WORKER PROCESS EXITING 44109 1727204228.52483: we have included files to process 44109 1727204228.52484: generating all_blocks data 44109 1727204228.52485: done generating all_blocks data 44109 1727204228.52486: processing included file: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 44109 1727204228.52487: loading included file: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 44109 1727204228.52490: Loading data from /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 44109 1727204228.52929: in VariableManager get_vars() 44109 1727204228.52948: done with get_vars() 44109 1727204228.53059: done processing included file 44109 1727204228.53061: iterating over new_blocks loaded from include file 44109 1727204228.53063: in VariableManager get_vars() 44109 1727204228.53137: done with get_vars() 44109 1727204228.53139: filtering new block on tags 44109 1727204228.53156: done filtering new block on tags 44109 1727204228.53158: done iterating over new_blocks loaded from include file included: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml for managed-node1 44109 1727204228.53163: extending task lists for all hosts with included blocks 44109 1727204228.55764: done extending task lists 44109 1727204228.55766: done processing included files 44109 1727204228.55767: results queue empty 44109 1727204228.55767: checking for any_errors_fatal 44109 1727204228.55771: done checking for any_errors_fatal 44109 1727204228.55772: checking for max_fail_percentage 44109 1727204228.55773: done checking for max_fail_percentage 44109 1727204228.55774: checking to 
see if all hosts have failed and the running result is not ok 44109 1727204228.55774: done checking to see if all hosts have failed 44109 1727204228.55777: getting the remaining hosts for this loop 44109 1727204228.55778: done getting the remaining hosts for this loop 44109 1727204228.55780: getting the next task for host managed-node1 44109 1727204228.55785: done getting next task for host managed-node1 44109 1727204228.55787: ^ task is: TASK: Include the task 'get_current_interfaces.yml' 44109 1727204228.55789: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44109 1727204228.55791: getting variables 44109 1727204228.55792: in VariableManager get_vars() 44109 1727204228.55806: Calling all_inventory to load vars for managed-node1 44109 1727204228.55808: Calling groups_inventory to load vars for managed-node1 44109 1727204228.55810: Calling all_plugins_inventory to load vars for managed-node1 44109 1727204228.55816: Calling all_plugins_play to load vars for managed-node1 44109 1727204228.55818: Calling groups_plugins_inventory to load vars for managed-node1 44109 1727204228.55822: Calling groups_plugins_play to load vars for managed-node1 44109 1727204228.56019: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44109 1727204228.56279: done with get_vars() 44109 1727204228.56289: done getting variables TASK [Include the task 'get_current_interfaces.yml'] *************************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:3 Tuesday 24 September 2024 14:57:08 -0400 (0:00:00.068) 0:00:05.359 ***** 44109 1727204228.56360: entering _queue_task() for managed-node1/include_tasks 44109 1727204228.56881: worker is 1 (out of 1 available) 44109 1727204228.56892: exiting _queue_task() for managed-node1/include_tasks 44109 1727204228.56902: done queuing things up, now waiting for results queue to drain 44109 1727204228.56903: waiting for pending results... 
44109 1727204228.57189: running TaskExecutor() for managed-node1/TASK: Include the task 'get_current_interfaces.yml' 44109 1727204228.57313: in run() - task 028d2410-947f-ed67-a560-000000000121 44109 1727204228.57335: variable 'ansible_search_path' from source: unknown 44109 1727204228.57343: variable 'ansible_search_path' from source: unknown 44109 1727204228.57385: calling self._execute() 44109 1727204228.57466: variable 'ansible_host' from source: host vars for 'managed-node1' 44109 1727204228.57481: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44109 1727204228.57496: variable 'omit' from source: magic vars 44109 1727204228.57882: variable 'ansible_distribution_major_version' from source: facts 44109 1727204228.57898: Evaluated conditional (ansible_distribution_major_version != '6'): True 44109 1727204228.57909: _execute() done 44109 1727204228.57917: dumping result to json 44109 1727204228.57923: done dumping result, returning 44109 1727204228.57931: done running TaskExecutor() for managed-node1/TASK: Include the task 'get_current_interfaces.yml' [028d2410-947f-ed67-a560-000000000121] 44109 1727204228.57940: sending task result for task 028d2410-947f-ed67-a560-000000000121 44109 1727204228.58040: done sending task result for task 028d2410-947f-ed67-a560-000000000121 44109 1727204228.58047: WORKER PROCESS EXITING 44109 1727204228.58090: no more pending results, returning what we have 44109 1727204228.58096: in VariableManager get_vars() 44109 1727204228.58137: Calling all_inventory to load vars for managed-node1 44109 1727204228.58139: Calling groups_inventory to load vars for managed-node1 44109 1727204228.58142: Calling all_plugins_inventory to load vars for managed-node1 44109 1727204228.58154: Calling all_plugins_play to load vars for managed-node1 44109 1727204228.58157: Calling groups_plugins_inventory to load vars for managed-node1 44109 1727204228.58160: Calling groups_plugins_play to load vars for managed-node1 44109 
1727204228.58571: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44109 1727204228.58781: done with get_vars() 44109 1727204228.58789: variable 'ansible_search_path' from source: unknown 44109 1727204228.58790: variable 'ansible_search_path' from source: unknown 44109 1727204228.58825: we have included files to process 44109 1727204228.58827: generating all_blocks data 44109 1727204228.58828: done generating all_blocks data 44109 1727204228.58829: processing included file: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 44109 1727204228.58830: loading included file: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 44109 1727204228.58832: Loading data from /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 44109 1727204228.59142: done processing included file 44109 1727204228.59145: iterating over new_blocks loaded from include file 44109 1727204228.59146: in VariableManager get_vars() 44109 1727204228.59162: done with get_vars() 44109 1727204228.59164: filtering new block on tags 44109 1727204228.59183: done filtering new block on tags 44109 1727204228.59185: done iterating over new_blocks loaded from include file included: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml for managed-node1 44109 1727204228.59190: extending task lists for all hosts with included blocks 44109 1727204228.59291: done extending task lists 44109 1727204228.59292: done processing included files 44109 1727204228.59293: results queue empty 44109 1727204228.59294: checking for any_errors_fatal 44109 1727204228.59296: done checking for any_errors_fatal 44109 1727204228.59297: checking for max_fail_percentage 44109 1727204228.59298: done 
checking for max_fail_percentage 44109 1727204228.59298: checking to see if all hosts have failed and the running result is not ok 44109 1727204228.59299: done checking to see if all hosts have failed 44109 1727204228.59300: getting the remaining hosts for this loop 44109 1727204228.59301: done getting the remaining hosts for this loop 44109 1727204228.59303: getting the next task for host managed-node1 44109 1727204228.59313: done getting next task for host managed-node1 44109 1727204228.59315: ^ task is: TASK: Gather current interface info 44109 1727204228.59318: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44109 1727204228.59321: getting variables 44109 1727204228.59322: in VariableManager get_vars() 44109 1727204228.59332: Calling all_inventory to load vars for managed-node1 44109 1727204228.59334: Calling groups_inventory to load vars for managed-node1 44109 1727204228.59335: Calling all_plugins_inventory to load vars for managed-node1 44109 1727204228.59342: Calling all_plugins_play to load vars for managed-node1 44109 1727204228.59344: Calling groups_plugins_inventory to load vars for managed-node1 44109 1727204228.59347: Calling groups_plugins_play to load vars for managed-node1 44109 1727204228.59483: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44109 1727204228.59664: done with get_vars() 44109 1727204228.59672: done getting variables 44109 1727204228.59712: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Gather current interface info] ******************************************* task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:3 Tuesday 24 September 2024 14:57:08 -0400 (0:00:00.033) 0:00:05.393 ***** 44109 1727204228.59739: entering _queue_task() for managed-node1/command 44109 1727204228.59999: worker is 1 (out of 1 available) 44109 1727204228.60011: exiting _queue_task() for managed-node1/command 44109 1727204228.60022: done queuing things up, now waiting for results queue to drain 44109 1727204228.60023: waiting for pending results... 
44109 1727204228.60265: running TaskExecutor() for managed-node1/TASK: Gather current interface info 44109 1727204228.60382: in run() - task 028d2410-947f-ed67-a560-0000000001b0 44109 1727204228.60468: variable 'ansible_search_path' from source: unknown 44109 1727204228.60471: variable 'ansible_search_path' from source: unknown 44109 1727204228.60473: calling self._execute() 44109 1727204228.60510: variable 'ansible_host' from source: host vars for 'managed-node1' 44109 1727204228.60523: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44109 1727204228.60538: variable 'omit' from source: magic vars 44109 1727204228.60942: variable 'ansible_distribution_major_version' from source: facts 44109 1727204228.60958: Evaluated conditional (ansible_distribution_major_version != '6'): True 44109 1727204228.60971: variable 'omit' from source: magic vars 44109 1727204228.61023: variable 'omit' from source: magic vars 44109 1727204228.61062: variable 'omit' from source: magic vars 44109 1727204228.61132: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 44109 1727204228.61153: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 44109 1727204228.61180: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 44109 1727204228.61240: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44109 1727204228.61243: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44109 1727204228.61255: variable 'inventory_hostname' from source: host vars for 'managed-node1' 44109 1727204228.61263: variable 'ansible_host' from source: host vars for 'managed-node1' 44109 1727204228.61271: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44109 
1727204228.61377: Set connection var ansible_connection to ssh 44109 1727204228.61395: Set connection var ansible_timeout to 10 44109 1727204228.61405: Set connection var ansible_module_compression to ZIP_DEFLATED 44109 1727204228.61457: Set connection var ansible_pipelining to False 44109 1727204228.61460: Set connection var ansible_shell_executable to /bin/sh 44109 1727204228.61463: Set connection var ansible_shell_type to sh 44109 1727204228.61466: variable 'ansible_shell_executable' from source: unknown 44109 1727204228.61467: variable 'ansible_connection' from source: unknown 44109 1727204228.61473: variable 'ansible_module_compression' from source: unknown 44109 1727204228.61486: variable 'ansible_shell_type' from source: unknown 44109 1727204228.61494: variable 'ansible_shell_executable' from source: unknown 44109 1727204228.61500: variable 'ansible_host' from source: host vars for 'managed-node1' 44109 1727204228.61508: variable 'ansible_pipelining' from source: unknown 44109 1727204228.61514: variable 'ansible_timeout' from source: unknown 44109 1727204228.61680: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44109 1727204228.61684: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 44109 1727204228.61687: variable 'omit' from source: magic vars 44109 1727204228.61689: starting attempt loop 44109 1727204228.61691: running the handler 44109 1727204228.61696: _low_level_execute_command(): starting 44109 1727204228.61707: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 44109 1727204228.62438: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44109 1727204228.62463: stderr chunk (state=3): >>>debug1: Reading configuration data 
/root/.ssh/config <<< 44109 1727204228.62482: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44109 1727204228.62585: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44109 1727204228.62615: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 44109 1727204228.62633: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44109 1727204228.62657: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44109 1727204228.62789: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44109 1727204228.64553: stdout chunk (state=3): >>>/root <<< 44109 1727204228.64691: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44109 1727204228.64718: stderr chunk (state=3): >>><<< 44109 1727204228.64736: stdout chunk (state=3): >>><<< 44109 1727204228.64766: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' 
host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44109 1727204228.64807: _low_level_execute_command(): starting 44109 1727204228.64812: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204228.6477246-44640-156058328550600 `" && echo ansible-tmp-1727204228.6477246-44640-156058328550600="` echo /root/.ansible/tmp/ansible-tmp-1727204228.6477246-44640-156058328550600 `" ) && sleep 0' 44109 1727204228.65430: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44109 1727204228.65445: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44109 1727204228.65460: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44109 1727204228.65489: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44109 1727204228.65545: 
stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44109 1727204228.65605: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 44109 1727204228.65638: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44109 1727204228.65660: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44109 1727204228.65777: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44109 1727204228.67850: stdout chunk (state=3): >>>ansible-tmp-1727204228.6477246-44640-156058328550600=/root/.ansible/tmp/ansible-tmp-1727204228.6477246-44640-156058328550600 <<< 44109 1727204228.67995: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44109 1727204228.68026: stderr chunk (state=3): >>><<< 44109 1727204228.68046: stdout chunk (state=3): >>><<< 44109 1727204228.68063: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204228.6477246-44640-156058328550600=/root/.ansible/tmp/ansible-tmp-1727204228.6477246-44640-156058328550600 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests 
final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44109 1727204228.68281: variable 'ansible_module_compression' from source: unknown 44109 1727204228.68285: ANSIBALLZ: Using generic lock for ansible.legacy.command 44109 1727204228.68287: ANSIBALLZ: Acquiring lock 44109 1727204228.68289: ANSIBALLZ: Lock acquired: 139907468546112 44109 1727204228.68291: ANSIBALLZ: Creating module 44109 1727204228.87841: ANSIBALLZ: Writing module into payload 44109 1727204228.87901: ANSIBALLZ: Writing module 44109 1727204228.87970: ANSIBALLZ: Renaming module 44109 1727204228.87985: ANSIBALLZ: Done creating module 44109 1727204228.88078: variable 'ansible_facts' from source: unknown 44109 1727204228.88161: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204228.6477246-44640-156058328550600/AnsiballZ_command.py 44109 1727204228.88623: Sending initial data 44109 1727204228.88626: Sent initial data (156 bytes) 44109 1727204228.89825: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44109 1727204228.90014: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 
10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK <<< 44109 1727204228.90134: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44109 1727204228.90250: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44109 1727204228.92011: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 <<< 44109 1727204228.92016: stderr chunk (state=3): >>>debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 44109 1727204228.92087: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 44109 1727204228.92182: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-44109pzfqangk/tmpqq0u538a /root/.ansible/tmp/ansible-tmp-1727204228.6477246-44640-156058328550600/AnsiballZ_command.py <<< 44109 1727204228.92185: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204228.6477246-44640-156058328550600/AnsiballZ_command.py" <<< 44109 1727204228.92285: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-44109pzfqangk/tmpqq0u538a" to remote "/root/.ansible/tmp/ansible-tmp-1727204228.6477246-44640-156058328550600/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204228.6477246-44640-156058328550600/AnsiballZ_command.py" <<< 44109 1727204228.93174: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44109 1727204228.93193: stdout chunk (state=3): >>><<< 44109 1727204228.93341: stderr chunk (state=3): >>><<< 44109 1727204228.93344: done transferring module to remote 44109 1727204228.93346: _low_level_execute_command(): starting 44109 1727204228.93348: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204228.6477246-44640-156058328550600/ /root/.ansible/tmp/ansible-tmp-1727204228.6477246-44640-156058328550600/AnsiballZ_command.py && sleep 0' 44109 1727204228.93912: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44109 1727204228.93993: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44109 1727204228.94033: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 44109 1727204228.94057: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44109 1727204228.94071: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44109 1727204228.94188: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44109 1727204228.96206: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44109 1727204228.96232: stderr chunk (state=3): >>><<< 44109 1727204228.96263: stdout chunk (state=3): >>><<< 44109 1727204228.96294: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match 
found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44109 1727204228.96297: _low_level_execute_command(): starting 44109 1727204228.96300: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204228.6477246-44640-156058328550600/AnsiballZ_command.py && sleep 0' 44109 1727204228.96750: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 44109 1727204228.96754: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44109 1727204228.96756: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44109 1727204228.96758: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 44109 1727204228.96760: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44109 1727204228.96808: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK <<< 44109 1727204228.96815: stderr chunk 
(state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44109 1727204228.96901: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44109 1727204229.13551: stdout chunk (state=3): >>> {"changed": true, "stdout": "bonding_masters\neth0\nlo", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-24 14:57:09.129065", "end": "2024-09-24 14:57:09.132681", "delta": "0:00:00.003616", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 44109 1727204229.15418: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.47 closed. <<< 44109 1727204229.15423: stdout chunk (state=3): >>><<< 44109 1727204229.15425: stderr chunk (state=3): >>><<< 44109 1727204229.15442: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "bonding_masters\neth0\nlo", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-24 14:57:09.129065", "end": "2024-09-24 14:57:09.132681", "delta": "0:00:00.003616", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass 
debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.47 closed. 44109 1727204229.15481: done with _execute_module (ansible.legacy.command, {'chdir': '/sys/class/net', '_raw_params': 'ls -1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204228.6477246-44640-156058328550600/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 44109 1727204229.15491: _low_level_execute_command(): starting 44109 1727204229.15497: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204228.6477246-44640-156058328550600/ > /dev/null 2>&1 && sleep 0' 44109 1727204229.17185: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config <<< 44109 1727204229.17189: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44109 1727204229.17191: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44109 1727204229.17207: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44109 1727204229.17292: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 44109 1727204229.17436: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44109 1727204229.19496: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44109 1727204229.19500: stdout chunk (state=3): >>><<< 44109 1727204229.19504: stderr chunk (state=3): >>><<< 44109 1727204229.19507: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests 
final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44109 1727204229.19510: handler run complete 44109 1727204229.19512: Evaluated conditional (False): False 44109 1727204229.19514: attempt loop complete, returning result 44109 1727204229.19516: _execute() done 44109 1727204229.19517: dumping result to json 44109 1727204229.19519: done dumping result, returning 44109 1727204229.19521: done running TaskExecutor() for managed-node1/TASK: Gather current interface info [028d2410-947f-ed67-a560-0000000001b0] 44109 1727204229.19526: sending task result for task 028d2410-947f-ed67-a560-0000000001b0 44109 1727204229.19982: done sending task result for task 028d2410-947f-ed67-a560-0000000001b0 44109 1727204229.19986: WORKER PROCESS EXITING ok: [managed-node1] => { "changed": false, "cmd": [ "ls", "-1" ], "delta": "0:00:00.003616", "end": "2024-09-24 14:57:09.132681", "rc": 0, "start": "2024-09-24 14:57:09.129065" } STDOUT: bonding_masters eth0 lo 44109 1727204229.20071: no more pending results, returning what we have 44109 1727204229.20077: results queue empty 44109 1727204229.20078: checking for any_errors_fatal 44109 1727204229.20080: done checking for any_errors_fatal 44109 1727204229.20081: checking for max_fail_percentage 44109 1727204229.20083: done checking for max_fail_percentage 44109 
1727204229.20084: checking to see if all hosts have failed and the running result is not ok 44109 1727204229.20085: done checking to see if all hosts have failed 44109 1727204229.20085: getting the remaining hosts for this loop 44109 1727204229.20087: done getting the remaining hosts for this loop 44109 1727204229.20090: getting the next task for host managed-node1 44109 1727204229.20098: done getting next task for host managed-node1 44109 1727204229.20100: ^ task is: TASK: Set current_interfaces 44109 1727204229.20104: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44109 1727204229.20108: getting variables 44109 1727204229.20440: in VariableManager get_vars() 44109 1727204229.20474: Calling all_inventory to load vars for managed-node1 44109 1727204229.20681: Calling groups_inventory to load vars for managed-node1 44109 1727204229.20685: Calling all_plugins_inventory to load vars for managed-node1 44109 1727204229.20695: Calling all_plugins_play to load vars for managed-node1 44109 1727204229.20698: Calling groups_plugins_inventory to load vars for managed-node1 44109 1727204229.20701: Calling groups_plugins_play to load vars for managed-node1 44109 1727204229.21353: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44109 1727204229.21643: done with get_vars() 44109 1727204229.21656: done getting variables 44109 1727204229.21720: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set current_interfaces] ************************************************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:9 Tuesday 24 September 2024 14:57:09 -0400 (0:00:00.622) 0:00:06.015 ***** 44109 1727204229.21964: entering _queue_task() for managed-node1/set_fact 44109 1727204229.22470: worker is 1 (out of 1 available) 44109 1727204229.22585: exiting _queue_task() for managed-node1/set_fact 44109 1727204229.22597: done queuing things up, now waiting for results queue to drain 44109 1727204229.22598: waiting for pending results... 
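The "Gather current interface info" task above returns the command module's result as JSON on stdout, and the following `set_fact` task derives `current_interfaces` from its `stdout` field. A minimal sketch of that parsing step — the `raw` string here is an abbreviated copy of the payload seen in the log, not output of a real API call:

```python
import json

# Abbreviated module result as it appears in the log above
# (the real payload also carries the full "invocation" metadata).
raw = '{"changed": true, "stdout": "bonding_masters\\neth0\\nlo", "rc": 0, "cmd": ["ls", "-1"]}'

result = json.loads(raw)

# The set_fact task effectively splits stdout on newlines, which is
# what the module's stdout_lines return value would contain.
current_interfaces = result["stdout"].split("\n")
print(current_interfaces)  # -> ['bonding_masters', 'eth0', 'lo']
```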
44109 1727204229.23096: running TaskExecutor() for managed-node1/TASK: Set current_interfaces 44109 1727204229.23507: in run() - task 028d2410-947f-ed67-a560-0000000001b1 44109 1727204229.23514: variable 'ansible_search_path' from source: unknown 44109 1727204229.23518: variable 'ansible_search_path' from source: unknown 44109 1727204229.23565: calling self._execute() 44109 1727204229.23983: variable 'ansible_host' from source: host vars for 'managed-node1' 44109 1727204229.23987: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44109 1727204229.23989: variable 'omit' from source: magic vars 44109 1727204229.25059: variable 'ansible_distribution_major_version' from source: facts 44109 1727204229.25284: Evaluated conditional (ansible_distribution_major_version != '6'): True 44109 1727204229.25288: variable 'omit' from source: magic vars 44109 1727204229.25290: variable 'omit' from source: magic vars 44109 1727204229.25632: variable '_current_interfaces' from source: set_fact 44109 1727204229.25683: variable 'omit' from source: magic vars 44109 1727204229.25782: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 44109 1727204229.26001: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 44109 1727204229.26050: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 44109 1727204229.26124: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44109 1727204229.26228: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44109 1727204229.26482: variable 'inventory_hostname' from source: host vars for 'managed-node1' 44109 1727204229.26485: variable 'ansible_host' from source: host vars for 'managed-node1' 44109 1727204229.26488: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44109 1727204229.26625: Set connection var ansible_connection to ssh 44109 1727204229.26689: Set connection var ansible_timeout to 10 44109 1727204229.26699: Set connection var ansible_module_compression to ZIP_DEFLATED 44109 1727204229.26767: Set connection var ansible_pipelining to False 44109 1727204229.26806: Set connection var ansible_shell_executable to /bin/sh 44109 1727204229.26821: Set connection var ansible_shell_type to sh 44109 1727204229.26880: variable 'ansible_shell_executable' from source: unknown 44109 1727204229.26931: variable 'ansible_connection' from source: unknown 44109 1727204229.26939: variable 'ansible_module_compression' from source: unknown 44109 1727204229.26945: variable 'ansible_shell_type' from source: unknown 44109 1727204229.26972: variable 'ansible_shell_executable' from source: unknown 44109 1727204229.26979: variable 'ansible_host' from source: host vars for 'managed-node1' 44109 1727204229.26982: variable 'ansible_pipelining' from source: unknown 44109 1727204229.26984: variable 'ansible_timeout' from source: unknown 44109 1727204229.26986: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44109 1727204229.27253: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 44109 1727204229.27391: variable 'omit' from source: magic vars 44109 1727204229.27402: starting attempt loop 44109 1727204229.27613: running the handler 44109 1727204229.27616: handler run complete 44109 1727204229.27619: attempt loop complete, returning result 44109 1727204229.27621: _execute() done 44109 1727204229.27623: dumping result to json 44109 1727204229.27624: done dumping result, returning 44109 
1727204229.27627: done running TaskExecutor() for managed-node1/TASK: Set current_interfaces [028d2410-947f-ed67-a560-0000000001b1] 44109 1727204229.27629: sending task result for task 028d2410-947f-ed67-a560-0000000001b1 44109 1727204229.27882: done sending task result for task 028d2410-947f-ed67-a560-0000000001b1 44109 1727204229.27886: WORKER PROCESS EXITING ok: [managed-node1] => { "ansible_facts": { "current_interfaces": [ "bonding_masters", "eth0", "lo" ] }, "changed": false } 44109 1727204229.27953: no more pending results, returning what we have 44109 1727204229.27956: results queue empty 44109 1727204229.27957: checking for any_errors_fatal 44109 1727204229.27965: done checking for any_errors_fatal 44109 1727204229.27966: checking for max_fail_percentage 44109 1727204229.27968: done checking for max_fail_percentage 44109 1727204229.27969: checking to see if all hosts have failed and the running result is not ok 44109 1727204229.27970: done checking to see if all hosts have failed 44109 1727204229.27970: getting the remaining hosts for this loop 44109 1727204229.27972: done getting the remaining hosts for this loop 44109 1727204229.27978: getting the next task for host managed-node1 44109 1727204229.27986: done getting next task for host managed-node1 44109 1727204229.27989: ^ task is: TASK: Show current_interfaces 44109 1727204229.27993: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44109 1727204229.27997: getting variables 44109 1727204229.27999: in VariableManager get_vars() 44109 1727204229.28041: Calling all_inventory to load vars for managed-node1 44109 1727204229.28044: Calling groups_inventory to load vars for managed-node1 44109 1727204229.28047: Calling all_plugins_inventory to load vars for managed-node1 44109 1727204229.28058: Calling all_plugins_play to load vars for managed-node1 44109 1727204229.28060: Calling groups_plugins_inventory to load vars for managed-node1 44109 1727204229.28063: Calling groups_plugins_play to load vars for managed-node1 44109 1727204229.28665: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44109 1727204229.29650: done with get_vars() 44109 1727204229.29659: done getting variables 44109 1727204229.29872: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [Show current_interfaces] ************************************************* task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:5 Tuesday 24 September 2024 14:57:09 -0400 (0:00:00.080) 0:00:06.096 ***** 44109 1727204229.30028: entering _queue_task() for managed-node1/debug 44109 1727204229.30029: Creating lock for debug 44109 1727204229.30624: worker is 1 (out of 1 available) 44109 1727204229.30634: exiting _queue_task() for managed-node1/debug 44109 1727204229.30644: done queuing things up, now waiting for results queue to drain 44109 1727204229.30645: waiting for pending results... 
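Nearly every stderr chunk in this log shows OpenSSH reusing an existing ControlMaster socket (`auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566'`), which is why each `_low_level_execute_command()` round trip completes in tens of milliseconds rather than paying a full handshake. A hedged sketch of pulling that socket path out of such a log line:

```python
import re

# One stderr line exactly as captured in the log above.
line = "debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566'"

# Extract the ControlPath socket Ansible set up for this host
# (Ansible derives the file name from a hash of host, port, and user).
match = re.search(r"Trying existing master at '([^']+)'", line)
socket_path = match.group(1) if match else None
print(socket_path)  # -> /root/.ansible/cp/a0f5415566
```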
44109 1727204229.31033: running TaskExecutor() for managed-node1/TASK: Show current_interfaces 44109 1727204229.31355: in run() - task 028d2410-947f-ed67-a560-000000000122 44109 1727204229.31591: variable 'ansible_search_path' from source: unknown 44109 1727204229.31595: variable 'ansible_search_path' from source: unknown 44109 1727204229.31629: calling self._execute() 44109 1727204229.31816: variable 'ansible_host' from source: host vars for 'managed-node1' 44109 1727204229.31820: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44109 1727204229.31828: variable 'omit' from source: magic vars 44109 1727204229.32569: variable 'ansible_distribution_major_version' from source: facts 44109 1727204229.32693: Evaluated conditional (ansible_distribution_major_version != '6'): True 44109 1727204229.32699: variable 'omit' from source: magic vars 44109 1727204229.32737: variable 'omit' from source: magic vars 44109 1727204229.33147: variable 'current_interfaces' from source: set_fact 44109 1727204229.33174: variable 'omit' from source: magic vars 44109 1727204229.33301: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 44109 1727204229.33452: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 44109 1727204229.33471: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 44109 1727204229.33546: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44109 1727204229.33552: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44109 1727204229.33589: variable 'inventory_hostname' from source: host vars for 'managed-node1' 44109 1727204229.33592: variable 'ansible_host' from source: host vars for 'managed-node1' 44109 1727204229.33595: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44109 1727204229.33925: Set connection var ansible_connection to ssh 44109 1727204229.33985: Set connection var ansible_timeout to 10 44109 1727204229.33989: Set connection var ansible_module_compression to ZIP_DEFLATED 44109 1727204229.33991: Set connection var ansible_pipelining to False 44109 1727204229.33994: Set connection var ansible_shell_executable to /bin/sh 44109 1727204229.33996: Set connection var ansible_shell_type to sh 44109 1727204229.34259: variable 'ansible_shell_executable' from source: unknown 44109 1727204229.34264: variable 'ansible_connection' from source: unknown 44109 1727204229.34267: variable 'ansible_module_compression' from source: unknown 44109 1727204229.34270: variable 'ansible_shell_type' from source: unknown 44109 1727204229.34274: variable 'ansible_shell_executable' from source: unknown 44109 1727204229.34278: variable 'ansible_host' from source: host vars for 'managed-node1' 44109 1727204229.34280: variable 'ansible_pipelining' from source: unknown 44109 1727204229.34283: variable 'ansible_timeout' from source: unknown 44109 1727204229.34422: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44109 1727204229.34604: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 44109 1727204229.34782: variable 'omit' from source: magic vars 44109 1727204229.34786: starting attempt loop 44109 1727204229.34788: running the handler 44109 1727204229.34882: handler run complete 44109 1727204229.34885: attempt loop complete, returning result 44109 1727204229.34888: _execute() done 44109 1727204229.34889: dumping result to json 44109 1727204229.34891: done dumping result, returning 44109 1727204229.34894: done 
running TaskExecutor() for managed-node1/TASK: Show current_interfaces [028d2410-947f-ed67-a560-000000000122] 44109 1727204229.34895: sending task result for task 028d2410-947f-ed67-a560-000000000122 44109 1727204229.34964: done sending task result for task 028d2410-947f-ed67-a560-000000000122 44109 1727204229.34967: WORKER PROCESS EXITING ok: [managed-node1] => {} MSG: current_interfaces: ['bonding_masters', 'eth0', 'lo'] 44109 1727204229.35022: no more pending results, returning what we have 44109 1727204229.35026: results queue empty 44109 1727204229.35026: checking for any_errors_fatal 44109 1727204229.35033: done checking for any_errors_fatal 44109 1727204229.35034: checking for max_fail_percentage 44109 1727204229.35036: done checking for max_fail_percentage 44109 1727204229.35036: checking to see if all hosts have failed and the running result is not ok 44109 1727204229.35037: done checking to see if all hosts have failed 44109 1727204229.35038: getting the remaining hosts for this loop 44109 1727204229.35039: done getting the remaining hosts for this loop 44109 1727204229.35043: getting the next task for host managed-node1 44109 1727204229.35051: done getting next task for host managed-node1 44109 1727204229.35054: ^ task is: TASK: Include the task 'manage_test_interface.yml' 44109 1727204229.35056: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44109 1727204229.35061: getting variables 44109 1727204229.35063: in VariableManager get_vars() 44109 1727204229.35210: Calling all_inventory to load vars for managed-node1 44109 1727204229.35216: Calling groups_inventory to load vars for managed-node1 44109 1727204229.35219: Calling all_plugins_inventory to load vars for managed-node1 44109 1727204229.35229: Calling all_plugins_play to load vars for managed-node1 44109 1727204229.35232: Calling groups_plugins_inventory to load vars for managed-node1 44109 1727204229.35234: Calling groups_plugins_play to load vars for managed-node1 44109 1727204229.35717: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44109 1727204229.36345: done with get_vars() 44109 1727204229.36358: done getting variables TASK [Include the task 'manage_test_interface.yml'] **************************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_routing_rules.yml:16 Tuesday 24 September 2024 14:57:09 -0400 (0:00:00.064) 0:00:06.161 ***** 44109 1727204229.36507: entering _queue_task() for managed-node1/include_tasks 44109 1727204229.37136: worker is 1 (out of 1 available) 44109 1727204229.37149: exiting _queue_task() for managed-node1/include_tasks 44109 1727204229.37161: done queuing things up, now waiting for results queue to drain 44109 1727204229.37162: waiting for pending results... 
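The `ok: [managed-node1]` result just above was produced by a `debug` action (the log records `Loading ActionModule 'debug'` before the handler runs). The task file itself is not shown in this log, so the following is only a minimal sketch of what such a task typically looks like; the task name and message shape are taken from the logged output, while the variable name `current_interfaces` is inferred from the printed message:

```yaml
# Hedged sketch - not the confirmed source of the test playbook.
# The task name and the "current_interfaces: [...]" message match the
# log above; everything else is an assumption.
- name: Show current_interfaces
  debug:
    msg: "current_interfaces: {{ current_interfaces }}"
```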
44109 1727204229.37542: running TaskExecutor() for managed-node1/TASK: Include the task 'manage_test_interface.yml' 44109 1727204229.37627: in run() - task 028d2410-947f-ed67-a560-00000000000d 44109 1727204229.37641: variable 'ansible_search_path' from source: unknown 44109 1727204229.37823: calling self._execute() 44109 1727204229.37871: variable 'ansible_host' from source: host vars for 'managed-node1' 44109 1727204229.37989: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44109 1727204229.37999: variable 'omit' from source: magic vars 44109 1727204229.38782: variable 'ansible_distribution_major_version' from source: facts 44109 1727204229.38786: Evaluated conditional (ansible_distribution_major_version != '6'): True 44109 1727204229.38788: _execute() done 44109 1727204229.38791: dumping result to json 44109 1727204229.38796: done dumping result, returning 44109 1727204229.38841: done running TaskExecutor() for managed-node1/TASK: Include the task 'manage_test_interface.yml' [028d2410-947f-ed67-a560-00000000000d] 44109 1727204229.38843: sending task result for task 028d2410-947f-ed67-a560-00000000000d 44109 1727204229.38917: done sending task result for task 028d2410-947f-ed67-a560-00000000000d 44109 1727204229.38920: WORKER PROCESS EXITING 44109 1727204229.38954: no more pending results, returning what we have 44109 1727204229.38960: in VariableManager get_vars() 44109 1727204229.39008: Calling all_inventory to load vars for managed-node1 44109 1727204229.39015: Calling groups_inventory to load vars for managed-node1 44109 1727204229.39017: Calling all_plugins_inventory to load vars for managed-node1 44109 1727204229.39030: Calling all_plugins_play to load vars for managed-node1 44109 1727204229.39033: Calling groups_plugins_inventory to load vars for managed-node1 44109 1727204229.39035: Calling groups_plugins_play to load vars for managed-node1 44109 1727204229.39785: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44109 1727204229.39973: done with get_vars() 44109 1727204229.39983: variable 'ansible_search_path' from source: unknown 44109 1727204229.40090: we have included files to process 44109 1727204229.40092: generating all_blocks data 44109 1727204229.40098: done generating all_blocks data 44109 1727204229.40102: processing included file: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml 44109 1727204229.40103: loading included file: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml 44109 1727204229.40106: Loading data from /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml 44109 1727204229.41144: in VariableManager get_vars() 44109 1727204229.41165: done with get_vars() 44109 1727204229.41590: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup 44109 1727204229.42774: done processing included file 44109 1727204229.42778: iterating over new_blocks loaded from include file 44109 1727204229.42779: in VariableManager get_vars() 44109 1727204229.42793: done with get_vars() 44109 1727204229.42794: filtering new block on tags 44109 1727204229.42940: done filtering new block on tags 44109 1727204229.42943: done iterating over new_blocks loaded from include file included: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml for managed-node1 44109 1727204229.42948: extending task lists for all hosts with included blocks 44109 1727204229.45594: done extending task lists 44109 1727204229.45596: done processing included files 44109 1727204229.45597: results queue empty 44109 1727204229.45598: checking for any_errors_fatal 44109 1727204229.45601: done checking for 
any_errors_fatal 44109 1727204229.45602: checking for max_fail_percentage 44109 1727204229.45603: done checking for max_fail_percentage 44109 1727204229.45603: checking to see if all hosts have failed and the running result is not ok 44109 1727204229.45604: done checking to see if all hosts have failed 44109 1727204229.45605: getting the remaining hosts for this loop 44109 1727204229.45606: done getting the remaining hosts for this loop 44109 1727204229.45608: getting the next task for host managed-node1 44109 1727204229.45615: done getting next task for host managed-node1 44109 1727204229.45617: ^ task is: TASK: Ensure state in ["present", "absent"] 44109 1727204229.45620: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44109 1727204229.45622: getting variables 44109 1727204229.45623: in VariableManager get_vars() 44109 1727204229.45636: Calling all_inventory to load vars for managed-node1 44109 1727204229.45638: Calling groups_inventory to load vars for managed-node1 44109 1727204229.45640: Calling all_plugins_inventory to load vars for managed-node1 44109 1727204229.45646: Calling all_plugins_play to load vars for managed-node1 44109 1727204229.45785: Calling groups_plugins_inventory to load vars for managed-node1 44109 1727204229.45789: Calling groups_plugins_play to load vars for managed-node1 44109 1727204229.46058: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44109 1727204229.46546: done with get_vars() 44109 1727204229.46556: done getting variables 44109 1727204229.46698: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [Ensure state in ["present", "absent"]] *********************************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:3 Tuesday 24 September 2024 14:57:09 -0400 (0:00:00.102) 0:00:06.263 ***** 44109 1727204229.46727: entering _queue_task() for managed-node1/fail 44109 1727204229.46728: Creating lock for fail 44109 1727204229.47507: worker is 1 (out of 1 available) 44109 1727204229.47524: exiting _queue_task() for managed-node1/fail 44109 1727204229.47534: done queuing things up, now waiting for results queue to drain 44109 1727204229.47535: waiting for pending results... 
44109 1727204229.48082: running TaskExecutor() for managed-node1/TASK: Ensure state in ["present", "absent"] 44109 1727204229.48091: in run() - task 028d2410-947f-ed67-a560-0000000001cc 44109 1727204229.48117: variable 'ansible_search_path' from source: unknown 44109 1727204229.48125: variable 'ansible_search_path' from source: unknown 44109 1727204229.48165: calling self._execute() 44109 1727204229.48266: variable 'ansible_host' from source: host vars for 'managed-node1' 44109 1727204229.48325: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44109 1727204229.48330: variable 'omit' from source: magic vars 44109 1727204229.48688: variable 'ansible_distribution_major_version' from source: facts 44109 1727204229.48706: Evaluated conditional (ansible_distribution_major_version != '6'): True 44109 1727204229.48842: variable 'state' from source: include params 44109 1727204229.48853: Evaluated conditional (state not in ["present", "absent"]): False 44109 1727204229.48860: when evaluation is False, skipping this task 44109 1727204229.48874: _execute() done 44109 1727204229.48981: dumping result to json 44109 1727204229.48984: done dumping result, returning 44109 1727204229.48987: done running TaskExecutor() for managed-node1/TASK: Ensure state in ["present", "absent"] [028d2410-947f-ed67-a560-0000000001cc] 44109 1727204229.48990: sending task result for task 028d2410-947f-ed67-a560-0000000001cc 44109 1727204229.49056: done sending task result for task 028d2410-947f-ed67-a560-0000000001cc 44109 1727204229.49060: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "state not in [\"present\", \"absent\"]", "skip_reason": "Conditional result was False" } 44109 1727204229.49108: no more pending results, returning what we have 44109 1727204229.49114: results queue empty 44109 1727204229.49115: checking for any_errors_fatal 44109 1727204229.49117: done checking for any_errors_fatal 44109 1727204229.49118: 
checking for max_fail_percentage 44109 1727204229.49119: done checking for max_fail_percentage 44109 1727204229.49120: checking to see if all hosts have failed and the running result is not ok 44109 1727204229.49121: done checking to see if all hosts have failed 44109 1727204229.49121: getting the remaining hosts for this loop 44109 1727204229.49123: done getting the remaining hosts for this loop 44109 1727204229.49126: getting the next task for host managed-node1 44109 1727204229.49132: done getting next task for host managed-node1 44109 1727204229.49134: ^ task is: TASK: Ensure type in ["dummy", "tap", "veth"] 44109 1727204229.49137: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44109 1727204229.49140: getting variables 44109 1727204229.49142: in VariableManager get_vars() 44109 1727204229.49183: Calling all_inventory to load vars for managed-node1 44109 1727204229.49187: Calling groups_inventory to load vars for managed-node1 44109 1727204229.49189: Calling all_plugins_inventory to load vars for managed-node1 44109 1727204229.49200: Calling all_plugins_play to load vars for managed-node1 44109 1727204229.49203: Calling groups_plugins_inventory to load vars for managed-node1 44109 1727204229.49205: Calling groups_plugins_play to load vars for managed-node1 44109 1727204229.49458: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44109 1727204229.49668: done with get_vars() 44109 1727204229.49680: done getting variables 44109 1727204229.49842: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Ensure type in ["dummy", "tap", "veth"]] ********************************* task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:8 Tuesday 24 September 2024 14:57:09 -0400 (0:00:00.031) 0:00:06.294 ***** 44109 1727204229.49870: entering _queue_task() for managed-node1/fail 44109 1727204229.50537: worker is 1 (out of 1 available) 44109 1727204229.50549: exiting _queue_task() for managed-node1/fail 44109 1727204229.50560: done queuing things up, now waiting for results queue to drain 44109 1727204229.50561: waiting for pending results... 
44109 1727204229.50785: running TaskExecutor() for managed-node1/TASK: Ensure type in ["dummy", "tap", "veth"] 44109 1727204229.50940: in run() - task 028d2410-947f-ed67-a560-0000000001cd 44109 1727204229.50945: variable 'ansible_search_path' from source: unknown 44109 1727204229.50947: variable 'ansible_search_path' from source: unknown 44109 1727204229.50962: calling self._execute() 44109 1727204229.51050: variable 'ansible_host' from source: host vars for 'managed-node1' 44109 1727204229.51061: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44109 1727204229.51157: variable 'omit' from source: magic vars 44109 1727204229.51441: variable 'ansible_distribution_major_version' from source: facts 44109 1727204229.51459: Evaluated conditional (ansible_distribution_major_version != '6'): True 44109 1727204229.51660: variable 'type' from source: set_fact 44109 1727204229.51672: Evaluated conditional (type not in ["dummy", "tap", "veth"]): False 44109 1727204229.51683: when evaluation is False, skipping this task 44109 1727204229.51704: _execute() done 44109 1727204229.51726: dumping result to json 44109 1727204229.51733: done dumping result, returning 44109 1727204229.51741: done running TaskExecutor() for managed-node1/TASK: Ensure type in ["dummy", "tap", "veth"] [028d2410-947f-ed67-a560-0000000001cd] 44109 1727204229.51750: sending task result for task 028d2410-947f-ed67-a560-0000000001cd skipping: [managed-node1] => { "changed": false, "false_condition": "type not in [\"dummy\", \"tap\", \"veth\"]", "skip_reason": "Conditional result was False" } 44109 1727204229.51959: no more pending results, returning what we have 44109 1727204229.51963: results queue empty 44109 1727204229.51964: checking for any_errors_fatal 44109 1727204229.51970: done checking for any_errors_fatal 44109 1727204229.51971: checking for max_fail_percentage 44109 1727204229.51973: done checking for max_fail_percentage 44109 1727204229.51973: checking to see if all 
hosts have failed and the running result is not ok 44109 1727204229.51974: done checking to see if all hosts have failed 44109 1727204229.51976: getting the remaining hosts for this loop 44109 1727204229.51978: done getting the remaining hosts for this loop 44109 1727204229.51981: getting the next task for host managed-node1 44109 1727204229.51989: done getting next task for host managed-node1 44109 1727204229.51992: ^ task is: TASK: Include the task 'show_interfaces.yml' 44109 1727204229.51996: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44109 1727204229.51999: getting variables 44109 1727204229.52000: in VariableManager get_vars() 44109 1727204229.52038: Calling all_inventory to load vars for managed-node1 44109 1727204229.52041: Calling groups_inventory to load vars for managed-node1 44109 1727204229.52043: Calling all_plugins_inventory to load vars for managed-node1 44109 1727204229.52054: Calling all_plugins_play to load vars for managed-node1 44109 1727204229.52057: Calling groups_plugins_inventory to load vars for managed-node1 44109 1727204229.52059: Calling groups_plugins_play to load vars for managed-node1 44109 1727204229.52517: done sending task result for task 028d2410-947f-ed67-a560-0000000001cd 44109 1727204229.52520: WORKER PROCESS EXITING 44109 1727204229.52542: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44109 1727204229.52751: done with get_vars() 44109 1727204229.52761: done getting variables TASK [Include the task 'show_interfaces.yml'] ********************************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:13 Tuesday 24 September 2024 14:57:09 -0400 (0:00:00.029) 0:00:06.324 ***** 44109 1727204229.52846: entering _queue_task() for managed-node1/include_tasks 44109 1727204229.53304: worker is 1 (out of 1 available) 44109 1727204229.53310: exiting _queue_task() for managed-node1/include_tasks 44109 1727204229.53319: done queuing things up, now waiting for results queue to drain 44109 1727204229.53320: waiting for pending results... 
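Both guard tasks above were skipped because their `when` expressions evaluated to False; the conditions are recorded verbatim in the `false_condition` fields, and the task paths place them at `manage_test_interface.yml:3` and `:8`. A sketch of what those two `fail` tasks look like, with the conditions copied from the log (the `msg` texts are assumptions, since the log never reached them):

```yaml
# Conditions taken verbatim from the logged false_condition values;
# the fail messages are assumed, as the tasks were skipped before
# rendering them.
- name: Ensure state in ["present", "absent"]
  fail:
    msg: "state must be one of: present, absent"   # assumed message
  when: state not in ["present", "absent"]

- name: Ensure type in ["dummy", "tap", "veth"]
  fail:
    msg: "type must be one of: dummy, tap, veth"   # assumed message
  when: type not in ["dummy", "tap", "veth"]
```

Because `fail` only fires when its `when` condition is True, these tasks act as input validation: with valid `state` and `type` values they always show up as `skipping:` results, exactly as logged here.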
44109 1727204229.53356: running TaskExecutor() for managed-node1/TASK: Include the task 'show_interfaces.yml' 44109 1727204229.53462: in run() - task 028d2410-947f-ed67-a560-0000000001ce 44109 1727204229.53486: variable 'ansible_search_path' from source: unknown 44109 1727204229.53496: variable 'ansible_search_path' from source: unknown 44109 1727204229.53536: calling self._execute() 44109 1727204229.53622: variable 'ansible_host' from source: host vars for 'managed-node1' 44109 1727204229.53634: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44109 1727204229.53655: variable 'omit' from source: magic vars 44109 1727204229.54062: variable 'ansible_distribution_major_version' from source: facts 44109 1727204229.54082: Evaluated conditional (ansible_distribution_major_version != '6'): True 44109 1727204229.54101: _execute() done 44109 1727204229.54109: dumping result to json 44109 1727204229.54117: done dumping result, returning 44109 1727204229.54127: done running TaskExecutor() for managed-node1/TASK: Include the task 'show_interfaces.yml' [028d2410-947f-ed67-a560-0000000001ce] 44109 1727204229.54137: sending task result for task 028d2410-947f-ed67-a560-0000000001ce 44109 1727204229.54346: no more pending results, returning what we have 44109 1727204229.54351: in VariableManager get_vars() 44109 1727204229.54393: Calling all_inventory to load vars for managed-node1 44109 1727204229.54396: Calling groups_inventory to load vars for managed-node1 44109 1727204229.54399: Calling all_plugins_inventory to load vars for managed-node1 44109 1727204229.54412: Calling all_plugins_play to load vars for managed-node1 44109 1727204229.54415: Calling groups_plugins_inventory to load vars for managed-node1 44109 1727204229.54419: Calling groups_plugins_play to load vars for managed-node1 44109 1727204229.54760: done sending task result for task 028d2410-947f-ed67-a560-0000000001ce 44109 1727204229.54763: WORKER PROCESS EXITING 44109 1727204229.54787: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44109 1727204229.54988: done with get_vars() 44109 1727204229.54994: variable 'ansible_search_path' from source: unknown 44109 1727204229.54995: variable 'ansible_search_path' from source: unknown 44109 1727204229.55034: we have included files to process 44109 1727204229.55036: generating all_blocks data 44109 1727204229.55038: done generating all_blocks data 44109 1727204229.55042: processing included file: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 44109 1727204229.55043: loading included file: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 44109 1727204229.55046: Loading data from /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 44109 1727204229.55148: in VariableManager get_vars() 44109 1727204229.55169: done with get_vars() 44109 1727204229.55318: done processing included file 44109 1727204229.55320: iterating over new_blocks loaded from include file 44109 1727204229.55321: in VariableManager get_vars() 44109 1727204229.55337: done with get_vars() 44109 1727204229.55338: filtering new block on tags 44109 1727204229.55361: done filtering new block on tags 44109 1727204229.55363: done iterating over new_blocks loaded from include file included: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml for managed-node1 44109 1727204229.55368: extending task lists for all hosts with included blocks 44109 1727204229.56102: done extending task lists 44109 1727204229.56103: done processing included files 44109 1727204229.56104: results queue empty 44109 1727204229.56105: checking for any_errors_fatal 44109 1727204229.56220: done checking for any_errors_fatal 44109 1727204229.56221: checking for 
max_fail_percentage 44109 1727204229.56223: done checking for max_fail_percentage 44109 1727204229.56224: checking to see if all hosts have failed and the running result is not ok 44109 1727204229.56224: done checking to see if all hosts have failed 44109 1727204229.56225: getting the remaining hosts for this loop 44109 1727204229.56226: done getting the remaining hosts for this loop 44109 1727204229.56229: getting the next task for host managed-node1 44109 1727204229.56234: done getting next task for host managed-node1 44109 1727204229.56236: ^ task is: TASK: Include the task 'get_current_interfaces.yml' 44109 1727204229.56239: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44109 1727204229.56241: getting variables 44109 1727204229.56242: in VariableManager get_vars() 44109 1727204229.56252: Calling all_inventory to load vars for managed-node1 44109 1727204229.56254: Calling groups_inventory to load vars for managed-node1 44109 1727204229.56256: Calling all_plugins_inventory to load vars for managed-node1 44109 1727204229.56261: Calling all_plugins_play to load vars for managed-node1 44109 1727204229.56263: Calling groups_plugins_inventory to load vars for managed-node1 44109 1727204229.56266: Calling groups_plugins_play to load vars for managed-node1 44109 1727204229.56614: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44109 1727204229.56864: done with get_vars() 44109 1727204229.56873: done getting variables TASK [Include the task 'get_current_interfaces.yml'] *************************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:3 Tuesday 24 September 2024 14:57:09 -0400 (0:00:00.042) 0:00:06.366 ***** 44109 1727204229.57059: entering _queue_task() for managed-node1/include_tasks 44109 1727204229.57558: worker is 1 (out of 1 available) 44109 1727204229.57572: exiting _queue_task() for managed-node1/include_tasks 44109 1727204229.57685: done queuing things up, now waiting for results queue to drain 44109 1727204229.57687: waiting for pending results... 
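The deeply nested `HOST STATE ... tasks child state?` dumps above reflect a chain of `include_tasks`: `tests_routing_rules.yml:16` pulls in `manage_test_interface.yml`, which at line 13 pulls in `show_interfaces.yml`, which at line 3 pulls in `get_current_interfaces.yml` — each include adds one more child state. Only the file names, task names, and line numbers are confirmed by the log; the exact YAML layout below is a sketch:

```yaml
# Hedged reconstruction of the include chain the log walks through.
# tests_routing_rules.yml (line 16):
- name: Include the task 'manage_test_interface.yml'
  include_tasks: tasks/manage_test_interface.yml

# manage_test_interface.yml (line 13):
- name: Include the task 'show_interfaces.yml'
  include_tasks: show_interfaces.yml

# show_interfaces.yml (line 3):
- name: Include the task 'get_current_interfaces.yml'
  include_tasks: get_current_interfaces.yml
```

Each include triggers the "we have included files to process / generating all_blocks data / extending task lists" sequence visible in the log, since included tasks are spliced into the host's task list at runtime rather than at parse time.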
44109 1727204229.58050: running TaskExecutor() for managed-node1/TASK: Include the task 'get_current_interfaces.yml' 44109 1727204229.58211: in run() - task 028d2410-947f-ed67-a560-000000000275 44109 1727204229.58242: variable 'ansible_search_path' from source: unknown 44109 1727204229.58350: variable 'ansible_search_path' from source: unknown 44109 1727204229.58353: calling self._execute() 44109 1727204229.58390: variable 'ansible_host' from source: host vars for 'managed-node1' 44109 1727204229.58401: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44109 1727204229.58413: variable 'omit' from source: magic vars 44109 1727204229.58808: variable 'ansible_distribution_major_version' from source: facts 44109 1727204229.58825: Evaluated conditional (ansible_distribution_major_version != '6'): True 44109 1727204229.58836: _execute() done 44109 1727204229.58845: dumping result to json 44109 1727204229.58853: done dumping result, returning 44109 1727204229.58864: done running TaskExecutor() for managed-node1/TASK: Include the task 'get_current_interfaces.yml' [028d2410-947f-ed67-a560-000000000275] 44109 1727204229.58874: sending task result for task 028d2410-947f-ed67-a560-000000000275 44109 1727204229.59152: done sending task result for task 028d2410-947f-ed67-a560-000000000275 44109 1727204229.59155: WORKER PROCESS EXITING 44109 1727204229.59217: no more pending results, returning what we have 44109 1727204229.59223: in VariableManager get_vars() 44109 1727204229.59267: Calling all_inventory to load vars for managed-node1 44109 1727204229.59271: Calling groups_inventory to load vars for managed-node1 44109 1727204229.59274: Calling all_plugins_inventory to load vars for managed-node1 44109 1727204229.59289: Calling all_plugins_play to load vars for managed-node1 44109 1727204229.59292: Calling groups_plugins_inventory to load vars for managed-node1 44109 1727204229.59296: Calling groups_plugins_play to load vars for managed-node1 44109 
1727204229.59690: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44109 1727204229.60112: done with get_vars() 44109 1727204229.60121: variable 'ansible_search_path' from source: unknown 44109 1727204229.60122: variable 'ansible_search_path' from source: unknown 44109 1727204229.60299: we have included files to process 44109 1727204229.60300: generating all_blocks data 44109 1727204229.60302: done generating all_blocks data 44109 1727204229.60304: processing included file: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 44109 1727204229.60305: loading included file: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 44109 1727204229.60307: Loading data from /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 44109 1727204229.60942: done processing included file 44109 1727204229.60944: iterating over new_blocks loaded from include file 44109 1727204229.60945: in VariableManager get_vars() 44109 1727204229.60963: done with get_vars() 44109 1727204229.60971: filtering new block on tags 44109 1727204229.61130: done filtering new block on tags 44109 1727204229.61137: done iterating over new_blocks loaded from include file included: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml for managed-node1 44109 1727204229.61143: extending task lists for all hosts with included blocks 44109 1727204229.61510: done extending task lists 44109 1727204229.61512: done processing included files 44109 1727204229.61513: results queue empty 44109 1727204229.61513: checking for any_errors_fatal 44109 1727204229.61516: done checking for any_errors_fatal 44109 1727204229.61517: checking for max_fail_percentage 44109 1727204229.61518: done 
checking for max_fail_percentage 44109 1727204229.61519: checking to see if all hosts have failed and the running result is not ok 44109 1727204229.61520: done checking to see if all hosts have failed 44109 1727204229.61521: getting the remaining hosts for this loop 44109 1727204229.61522: done getting the remaining hosts for this loop 44109 1727204229.61525: getting the next task for host managed-node1 44109 1727204229.61529: done getting next task for host managed-node1 44109 1727204229.61531: ^ task is: TASK: Gather current interface info 44109 1727204229.61534: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44109 1727204229.61536: getting variables 44109 1727204229.61537: in VariableManager get_vars() 44109 1727204229.61549: Calling all_inventory to load vars for managed-node1 44109 1727204229.61551: Calling groups_inventory to load vars for managed-node1 44109 1727204229.61553: Calling all_plugins_inventory to load vars for managed-node1 44109 1727204229.61558: Calling all_plugins_play to load vars for managed-node1 44109 1727204229.61560: Calling groups_plugins_inventory to load vars for managed-node1 44109 1727204229.61563: Calling groups_plugins_play to load vars for managed-node1 44109 1727204229.61998: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44109 1727204229.62522: done with get_vars() 44109 1727204229.62533: done getting variables 44109 1727204229.62585: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [Gather current interface info] *******************************************
task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:3
Tuesday 24 September 2024 14:57:09 -0400 (0:00:00.056) 0:00:06.423 *****

44109 1727204229.62689: entering _queue_task() for managed-node1/command 44109 1727204229.63640: worker is 1 (out of 1 available) 44109 1727204229.63652: exiting _queue_task() for managed-node1/command 44109 1727204229.63661: done queuing things up, now waiting for results queue to drain 44109 1727204229.63662: waiting for pending results...
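Editor's note on reading the trace below: every remote step goes through `_low_level_execute_command()`, which runs its payload under `/bin/sh -c '… && sleep 0'` (see the `echo ~`, `mkdir`, `chmod`, and cleanup records that follow). A minimal sketch of that wrapping pattern, with an illustrative helper name that is not Ansible's actual API:

```python
def wrap_low_level(cmd: str) -> list[str]:
    # Mirror the pattern visible in the trace: the payload runs under
    # /bin/sh -c with a trailing "sleep 0" appended to the command line.
    return ["/bin/sh", "-c", f"{cmd} && sleep 0"]

# Matches the first low-level command in the trace (home-dir discovery):
print(wrap_low_level("echo ~"))  # → ['/bin/sh', '-c', 'echo ~ && sleep 0']
```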
44109 1727204229.64297: running TaskExecutor() for managed-node1/TASK: Gather current interface info 44109 1727204229.64340: in run() - task 028d2410-947f-ed67-a560-0000000002ac 44109 1727204229.64409: variable 'ansible_search_path' from source: unknown 44109 1727204229.64528: variable 'ansible_search_path' from source: unknown 44109 1727204229.64532: calling self._execute() 44109 1727204229.64660: variable 'ansible_host' from source: host vars for 'managed-node1' 44109 1727204229.64722: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44109 1727204229.64774: variable 'omit' from source: magic vars 44109 1727204229.65523: variable 'ansible_distribution_major_version' from source: facts 44109 1727204229.65538: Evaluated conditional (ansible_distribution_major_version != '6'): True 44109 1727204229.65595: variable 'omit' from source: magic vars 44109 1727204229.65804: variable 'omit' from source: magic vars 44109 1727204229.65808: variable 'omit' from source: magic vars 44109 1727204229.65811: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 44109 1727204229.65813: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 44109 1727204229.65830: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 44109 1727204229.65853: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44109 1727204229.65870: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44109 1727204229.65912: variable 'inventory_hostname' from source: host vars for 'managed-node1' 44109 1727204229.65923: variable 'ansible_host' from source: host vars for 'managed-node1' 44109 1727204229.65932: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44109 
1727204229.66040: Set connection var ansible_connection to ssh 44109 1727204229.66053: Set connection var ansible_timeout to 10 44109 1727204229.66063: Set connection var ansible_module_compression to ZIP_DEFLATED 44109 1727204229.66077: Set connection var ansible_pipelining to False 44109 1727204229.66089: Set connection var ansible_shell_executable to /bin/sh 44109 1727204229.66139: Set connection var ansible_shell_type to sh 44109 1727204229.66168: variable 'ansible_shell_executable' from source: unknown 44109 1727204229.66349: variable 'ansible_connection' from source: unknown 44109 1727204229.66353: variable 'ansible_module_compression' from source: unknown 44109 1727204229.66355: variable 'ansible_shell_type' from source: unknown 44109 1727204229.66358: variable 'ansible_shell_executable' from source: unknown 44109 1727204229.66360: variable 'ansible_host' from source: host vars for 'managed-node1' 44109 1727204229.66362: variable 'ansible_pipelining' from source: unknown 44109 1727204229.66364: variable 'ansible_timeout' from source: unknown 44109 1727204229.66366: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44109 1727204229.66472: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 44109 1727204229.66493: variable 'omit' from source: magic vars 44109 1727204229.66577: starting attempt loop 44109 1727204229.66785: running the handler 44109 1727204229.66788: _low_level_execute_command(): starting 44109 1727204229.66790: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 44109 1727204229.68197: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44109 1727204229.68213: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 44109 1727204229.68290: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44109 1727204229.68313: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44109 1727204229.68421: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44109 1727204229.70218: stdout chunk (state=3): >>>/root <<< 44109 1727204229.70363: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44109 1727204229.70378: stdout chunk (state=3): >>><<< 44109 1727204229.70397: stderr chunk (state=3): >>><<< 44109 1727204229.70691: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44109 1727204229.70694: _low_level_execute_command(): starting 44109 1727204229.70698: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204229.7060382-44851-238258344773336 `" && echo ansible-tmp-1727204229.7060382-44851-238258344773336="` echo /root/.ansible/tmp/ansible-tmp-1727204229.7060382-44851-238258344773336 `" ) && sleep 0' 44109 1727204229.71795: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44109 1727204229.71808: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44109 1727204229.71822: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44109 1727204229.71836: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44109 1727204229.71849: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 <<< 44109 1727204229.71856: stderr chunk (state=3): >>>debug2: match not found <<< 44109 1727204229.71867: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44109 
1727204229.71899: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 44109 1727204229.71902: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.14.47 is address <<< 44109 1727204229.72016: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 44109 1727204229.72019: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44109 1727204229.72021: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44109 1727204229.72023: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44109 1727204229.72061: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 <<< 44109 1727204229.72068: stderr chunk (state=3): >>>debug2: match found <<< 44109 1727204229.72071: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44109 1727204229.72073: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 44109 1727204229.72077: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44109 1727204229.72271: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44109 1727204229.72402: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44109 1727204229.74507: stdout chunk (state=3): >>>ansible-tmp-1727204229.7060382-44851-238258344773336=/root/.ansible/tmp/ansible-tmp-1727204229.7060382-44851-238258344773336 <<< 44109 1727204229.74641: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44109 1727204229.74698: stderr chunk (state=3): >>><<< 44109 1727204229.74781: stdout chunk (state=3): >>><<< 44109 1727204229.74785: _low_level_execute_command() done: rc=0, 
stdout=ansible-tmp-1727204229.7060382-44851-238258344773336=/root/.ansible/tmp/ansible-tmp-1727204229.7060382-44851-238258344773336 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44109 1727204229.74787: variable 'ansible_module_compression' from source: unknown 44109 1727204229.74823: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-44109pzfqangk/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 44109 1727204229.74857: variable 'ansible_facts' from source: unknown 44109 1727204229.74952: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204229.7060382-44851-238258344773336/AnsiballZ_command.py 44109 1727204229.75098: Sending initial data 44109 1727204229.75102: Sent initial data (156 bytes) 44109 1727204229.75784: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44109 1727204229.75880: stderr chunk 
(state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 44109 1727204229.75930: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44109 1727204229.75959: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44109 1727204229.76052: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44109 1727204229.77867: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" 
debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 44109 1727204229.78092: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 44109 1727204229.78095: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-44109pzfqangk/tmpw_zj7lf7 /root/.ansible/tmp/ansible-tmp-1727204229.7060382-44851-238258344773336/AnsiballZ_command.py <<< 44109 1727204229.78098: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204229.7060382-44851-238258344773336/AnsiballZ_command.py" <<< 44109 1727204229.78210: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-44109pzfqangk/tmpw_zj7lf7" to remote "/root/.ansible/tmp/ansible-tmp-1727204229.7060382-44851-238258344773336/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204229.7060382-44851-238258344773336/AnsiballZ_command.py" <<< 44109 1727204229.81112: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44109 1727204229.81249: stderr chunk (state=3): >>><<< 44109 1727204229.81312: stdout chunk (state=3): >>><<< 44109 1727204229.81489: done transferring module to remote 44109 1727204229.81492: _low_level_execute_command(): starting 44109 1727204229.81495: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204229.7060382-44851-238258344773336/ /root/.ansible/tmp/ansible-tmp-1727204229.7060382-44851-238258344773336/AnsiballZ_command.py && sleep 0' 44109 1727204229.82710: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 44109 1727204229.82714: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 
debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44109 1727204229.82785: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44109 1727204229.82799: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 44109 1727204229.83141: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44109 1727204229.83197: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44109 1727204229.83422: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44109 1727204229.85339: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44109 1727204229.85694: stderr chunk (state=3): >>><<< 44109 1727204229.85699: stdout chunk (state=3): >>><<< 44109 1727204229.85702: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: 
Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44109 1727204229.85704: _low_level_execute_command(): starting 44109 1727204229.85707: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204229.7060382-44851-238258344773336/AnsiballZ_command.py && sleep 0' 44109 1727204229.87191: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44109 1727204229.87428: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 44109 1727204229.87442: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK 
<<< 44109 1727204229.87493: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44109 1727204229.87804: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44109 1727204230.04518: stdout chunk (state=3): >>> {"changed": true, "stdout": "bonding_masters\neth0\nlo", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-24 14:57:10.039725", "end": "2024-09-24 14:57:10.043377", "delta": "0:00:00.003652", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 44109 1727204230.06225: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44109 1727204230.06289: stderr chunk (state=3): >>>Shared connection to 10.31.14.47 closed. <<< 44109 1727204230.06341: stderr chunk (state=3): >>><<< 44109 1727204230.06564: stdout chunk (state=3): >>><<< 44109 1727204230.06568: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "bonding_masters\neth0\nlo", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-24 14:57:10.039725", "end": "2024-09-24 14:57:10.043377", "delta": "0:00:00.003652", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.47 closed. 44109 1727204230.06572: done with _execute_module (ansible.legacy.command, {'chdir': '/sys/class/net', '_raw_params': 'ls -1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204229.7060382-44851-238258344773336/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 44109 1727204230.06575: _low_level_execute_command(): starting 44109 1727204230.06579: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204229.7060382-44851-238258344773336/ > /dev/null 2>&1 && sleep 0' 44109 1727204230.08005: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 
2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44109 1727204230.08040: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 44109 1727204230.08068: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44109 1727204230.08088: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44109 1727204230.08201: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44109 1727204230.10217: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44109 1727204230.10228: stdout chunk (state=3): >>><<< 44109 1727204230.10241: stderr chunk (state=3): >>><<< 44109 1727204230.10263: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44109 1727204230.10285: handler run complete 44109 1727204230.10339: Evaluated conditional (False): False 44109 1727204230.10439: attempt loop complete, returning result 44109 1727204230.10511: _execute() done 44109 1727204230.10519: dumping result to json 44109 1727204230.10538: done dumping result, returning 44109 1727204230.10551: done running TaskExecutor() for managed-node1/TASK: Gather current interface info [028d2410-947f-ed67-a560-0000000002ac] 44109 1727204230.10785: sending task result for task 028d2410-947f-ed67-a560-0000000002ac 44109 1727204230.10858: done sending task result for task 028d2410-947f-ed67-a560-0000000002ac 44109 1727204230.10861: WORKER PROCESS EXITING
ok: [managed-node1] => {
    "changed": false,
    "cmd": [
        "ls",
        "-1"
    ],
    "delta": "0:00:00.003652",
    "end": "2024-09-24 14:57:10.043377",
    "rc": 0,
    "start": "2024-09-24 14:57:10.039725"
}

STDOUT:

bonding_masters
eth0
lo

44109 1727204230.10961: no more pending results, returning what we have 44109 1727204230.10965: results queue empty 44109 1727204230.10966: checking for any_errors_fatal 44109 1727204230.10967: done checking for any_errors_fatal 44109 1727204230.10968: checking for max_fail_percentage 44109
1727204230.10972: done checking for max_fail_percentage 44109 1727204230.10972: checking to see if all hosts have failed and the running result is not ok 44109 1727204230.10973: done checking to see if all hosts have failed 44109 1727204230.10974: getting the remaining hosts for this loop 44109 1727204230.11090: done getting the remaining hosts for this loop 44109 1727204230.11095: getting the next task for host managed-node1 44109 1727204230.11105: done getting next task for host managed-node1 44109 1727204230.11108: ^ task is: TASK: Set current_interfaces 44109 1727204230.11112: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44109 1727204230.11117: getting variables 44109 1727204230.11119: in VariableManager get_vars() 44109 1727204230.11156: Calling all_inventory to load vars for managed-node1 44109 1727204230.11159: Calling groups_inventory to load vars for managed-node1 44109 1727204230.11161: Calling all_plugins_inventory to load vars for managed-node1 44109 1727204230.11172: Calling all_plugins_play to load vars for managed-node1 44109 1727204230.11175: Calling groups_plugins_inventory to load vars for managed-node1 44109 1727204230.11269: Calling groups_plugins_play to load vars for managed-node1 44109 1727204230.12027: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44109 1727204230.12588: done with get_vars() 44109 1727204230.12790: done getting variables 44109 1727204230.12851: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [Set current_interfaces] **************************************************
task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:9
Tuesday 24 September 2024 14:57:10 -0400 (0:00:00.501) 0:00:06.926 *****

44109 1727204230.13008: entering _queue_task() for managed-node1/set_fact 44109 1727204230.13888: worker is 1 (out of 1 available) 44109 1727204230.13899: exiting _queue_task() for managed-node1/set_fact 44109 1727204230.13909: done queuing things up, now waiting for results queue to drain 44109 1727204230.13910: waiting for pending results...
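Editor's note: the module result for the command task above arrives as plain JSON before Ansible renders the `ok:` block. As a quick sketch (not part of the run), the interface list the next task consumes can be recovered from the logged `stdout` field:

```python
import json

# Abridged copy of the module result JSON that appears in the trace above
raw = (
    '{"changed": true, "stdout": "bonding_masters\\neth0\\nlo", '
    '"rc": 0, "cmd": ["ls", "-1"]}'
)
result = json.loads(raw)

# stdout holds one interface name per line, exactly what `ls -1` printed
interfaces = result["stdout"].splitlines()
print(interfaces)  # → ['bonding_masters', 'eth0', 'lo']
```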
44109 1727204230.14146: running TaskExecutor() for managed-node1/TASK: Set current_interfaces 44109 1727204230.14194: in run() - task 028d2410-947f-ed67-a560-0000000002ad 44109 1727204230.14220: variable 'ansible_search_path' from source: unknown 44109 1727204230.14240: variable 'ansible_search_path' from source: unknown 44109 1727204230.14328: calling self._execute() 44109 1727204230.14719: variable 'ansible_host' from source: host vars for 'managed-node1' 44109 1727204230.14723: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44109 1727204230.14725: variable 'omit' from source: magic vars 44109 1727204230.15407: variable 'ansible_distribution_major_version' from source: facts 44109 1727204230.15484: Evaluated conditional (ansible_distribution_major_version != '6'): True 44109 1727204230.15496: variable 'omit' from source: magic vars 44109 1727204230.15632: variable 'omit' from source: magic vars 44109 1727204230.15960: variable '_current_interfaces' from source: set_fact 44109 1727204230.16284: variable 'omit' from source: magic vars 44109 1727204230.16405: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 44109 1727204230.16526: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 44109 1727204230.16588: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 44109 1727204230.16630: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44109 1727204230.16733: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44109 1727204230.16768: variable 'inventory_hostname' from source: host vars for 'managed-node1' 44109 1727204230.16779: variable 'ansible_host' from source: host vars for 'managed-node1' 44109 1727204230.16789: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44109 1727204230.17023: Set connection var ansible_connection to ssh 44109 1727204230.17115: Set connection var ansible_timeout to 10 44109 1727204230.17282: Set connection var ansible_module_compression to ZIP_DEFLATED 44109 1727204230.17286: Set connection var ansible_pipelining to False 44109 1727204230.17288: Set connection var ansible_shell_executable to /bin/sh 44109 1727204230.17290: Set connection var ansible_shell_type to sh 44109 1727204230.17292: variable 'ansible_shell_executable' from source: unknown 44109 1727204230.17295: variable 'ansible_connection' from source: unknown 44109 1727204230.17297: variable 'ansible_module_compression' from source: unknown 44109 1727204230.17299: variable 'ansible_shell_type' from source: unknown 44109 1727204230.17301: variable 'ansible_shell_executable' from source: unknown 44109 1727204230.17304: variable 'ansible_host' from source: host vars for 'managed-node1' 44109 1727204230.17306: variable 'ansible_pipelining' from source: unknown 44109 1727204230.17308: variable 'ansible_timeout' from source: unknown 44109 1727204230.17310: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44109 1727204230.17540: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 44109 1727204230.17557: variable 'omit' from source: magic vars 44109 1727204230.17568: starting attempt loop 44109 1727204230.17577: running the handler 44109 1727204230.17593: handler run complete 44109 1727204230.17608: attempt loop complete, returning result 44109 1727204230.17614: _execute() done 44109 1727204230.17620: dumping result to json 44109 1727204230.17627: done dumping result, returning 44109 
1727204230.17638: done running TaskExecutor() for managed-node1/TASK: Set current_interfaces [028d2410-947f-ed67-a560-0000000002ad] 44109 1727204230.17652: sending task result for task 028d2410-947f-ed67-a560-0000000002ad 44109 1727204230.17881: done sending task result for task 028d2410-947f-ed67-a560-0000000002ad 44109 1727204230.17885: WORKER PROCESS EXITING ok: [managed-node1] => { "ansible_facts": { "current_interfaces": [ "bonding_masters", "eth0", "lo" ] }, "changed": false } 44109 1727204230.17943: no more pending results, returning what we have 44109 1727204230.17947: results queue empty 44109 1727204230.17947: checking for any_errors_fatal 44109 1727204230.17958: done checking for any_errors_fatal 44109 1727204230.17958: checking for max_fail_percentage 44109 1727204230.17960: done checking for max_fail_percentage 44109 1727204230.17960: checking to see if all hosts have failed and the running result is not ok 44109 1727204230.17961: done checking to see if all hosts have failed 44109 1727204230.17962: getting the remaining hosts for this loop 44109 1727204230.17963: done getting the remaining hosts for this loop 44109 1727204230.17966: getting the next task for host managed-node1 44109 1727204230.17973: done getting next task for host managed-node1 44109 1727204230.18100: ^ task is: TASK: Show current_interfaces 44109 1727204230.18105: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44109 1727204230.18110: getting variables 44109 1727204230.18114: in VariableManager get_vars() 44109 1727204230.18150: Calling all_inventory to load vars for managed-node1 44109 1727204230.18153: Calling groups_inventory to load vars for managed-node1 44109 1727204230.18155: Calling all_plugins_inventory to load vars for managed-node1 44109 1727204230.18165: Calling all_plugins_play to load vars for managed-node1 44109 1727204230.18168: Calling groups_plugins_inventory to load vars for managed-node1 44109 1727204230.18171: Calling groups_plugins_play to load vars for managed-node1 44109 1727204230.18371: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44109 1727204230.18568: done with get_vars() 44109 1727204230.18583: done getting variables 44109 1727204230.18651: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Show current_interfaces] ************************************************* task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:5 Tuesday 24 September 2024 14:57:10 -0400 (0:00:00.056) 0:00:06.983 ***** 44109 1727204230.18685: entering _queue_task() for managed-node1/debug 44109 1727204230.19013: worker is 1 (out of 1 available) 44109 1727204230.19024: exiting _queue_task() for managed-node1/debug 44109 1727204230.19036: done queuing things up, now waiting for results queue to drain 44109 1727204230.19036: waiting for pending results... 
44109 1727204230.19289: running TaskExecutor() for managed-node1/TASK: Show current_interfaces 44109 1727204230.19452: in run() - task 028d2410-947f-ed67-a560-000000000276 44109 1727204230.19488: variable 'ansible_search_path' from source: unknown 44109 1727204230.19504: variable 'ansible_search_path' from source: unknown 44109 1727204230.19633: calling self._execute() 44109 1727204230.19663: variable 'ansible_host' from source: host vars for 'managed-node1' 44109 1727204230.19674: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44109 1727204230.19691: variable 'omit' from source: magic vars 44109 1727204230.20101: variable 'ansible_distribution_major_version' from source: facts 44109 1727204230.20121: Evaluated conditional (ansible_distribution_major_version != '6'): True 44109 1727204230.20132: variable 'omit' from source: magic vars 44109 1727204230.20195: variable 'omit' from source: magic vars 44109 1727204230.20301: variable 'current_interfaces' from source: set_fact 44109 1727204230.20338: variable 'omit' from source: magic vars 44109 1727204230.20394: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 44109 1727204230.20481: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 44109 1727204230.20484: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 44109 1727204230.20486: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44109 1727204230.20488: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44109 1727204230.20528: variable 'inventory_hostname' from source: host vars for 'managed-node1' 44109 1727204230.20537: variable 'ansible_host' from source: host vars for 'managed-node1' 44109 1727204230.20545: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44109 1727204230.20663: Set connection var ansible_connection to ssh 44109 1727204230.20677: Set connection var ansible_timeout to 10 44109 1727204230.20688: Set connection var ansible_module_compression to ZIP_DEFLATED 44109 1727204230.20701: Set connection var ansible_pipelining to False 44109 1727204230.20723: Set connection var ansible_shell_executable to /bin/sh 44109 1727204230.20831: Set connection var ansible_shell_type to sh 44109 1727204230.20834: variable 'ansible_shell_executable' from source: unknown 44109 1727204230.20837: variable 'ansible_connection' from source: unknown 44109 1727204230.20839: variable 'ansible_module_compression' from source: unknown 44109 1727204230.20841: variable 'ansible_shell_type' from source: unknown 44109 1727204230.20843: variable 'ansible_shell_executable' from source: unknown 44109 1727204230.20845: variable 'ansible_host' from source: host vars for 'managed-node1' 44109 1727204230.20847: variable 'ansible_pipelining' from source: unknown 44109 1727204230.20848: variable 'ansible_timeout' from source: unknown 44109 1727204230.20850: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44109 1727204230.21000: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 44109 1727204230.21019: variable 'omit' from source: magic vars 44109 1727204230.21045: starting attempt loop 44109 1727204230.21060: running the handler 44109 1727204230.21166: handler run complete 44109 1727204230.21263: attempt loop complete, returning result 44109 1727204230.21268: _execute() done 44109 1727204230.21271: dumping result to json 44109 1727204230.21273: done dumping result, returning 44109 1727204230.21278: done 
running TaskExecutor() for managed-node1/TASK: Show current_interfaces [028d2410-947f-ed67-a560-000000000276] 44109 1727204230.21281: sending task result for task 028d2410-947f-ed67-a560-000000000276 44109 1727204230.21351: done sending task result for task 028d2410-947f-ed67-a560-000000000276 44109 1727204230.21354: WORKER PROCESS EXITING ok: [managed-node1] => {} MSG: current_interfaces: ['bonding_masters', 'eth0', 'lo'] 44109 1727204230.21406: no more pending results, returning what we have 44109 1727204230.21410: results queue empty 44109 1727204230.21411: checking for any_errors_fatal 44109 1727204230.21419: done checking for any_errors_fatal 44109 1727204230.21420: checking for max_fail_percentage 44109 1727204230.21421: done checking for max_fail_percentage 44109 1727204230.21422: checking to see if all hosts have failed and the running result is not ok 44109 1727204230.21423: done checking to see if all hosts have failed 44109 1727204230.21424: getting the remaining hosts for this loop 44109 1727204230.21425: done getting the remaining hosts for this loop 44109 1727204230.21429: getting the next task for host managed-node1 44109 1727204230.21437: done getting next task for host managed-node1 44109 1727204230.21439: ^ task is: TASK: Install iproute 44109 1727204230.21442: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44109 1727204230.21446: getting variables 44109 1727204230.21448: in VariableManager get_vars() 44109 1727204230.21484: Calling all_inventory to load vars for managed-node1 44109 1727204230.21487: Calling groups_inventory to load vars for managed-node1 44109 1727204230.21489: Calling all_plugins_inventory to load vars for managed-node1 44109 1727204230.21499: Calling all_plugins_play to load vars for managed-node1 44109 1727204230.21501: Calling groups_plugins_inventory to load vars for managed-node1 44109 1727204230.21505: Calling groups_plugins_play to load vars for managed-node1 44109 1727204230.21987: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44109 1727204230.22188: done with get_vars() 44109 1727204230.22198: done getting variables 44109 1727204230.22262: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Install iproute] ********************************************************* task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:16 Tuesday 24 September 2024 14:57:10 -0400 (0:00:00.036) 0:00:07.019 ***** 44109 1727204230.22294: entering _queue_task() for managed-node1/package 44109 1727204230.22680: worker is 1 (out of 1 available) 44109 1727204230.22691: exiting _queue_task() for managed-node1/package 44109 1727204230.22702: done queuing things up, now waiting for results queue to drain 44109 1727204230.22703: waiting for pending results... 
44109 1727204230.22997: running TaskExecutor() for managed-node1/TASK: Install iproute 44109 1727204230.23002: in run() - task 028d2410-947f-ed67-a560-0000000001cf 44109 1727204230.23005: variable 'ansible_search_path' from source: unknown 44109 1727204230.23007: variable 'ansible_search_path' from source: unknown 44109 1727204230.23095: calling self._execute() 44109 1727204230.23126: variable 'ansible_host' from source: host vars for 'managed-node1' 44109 1727204230.23137: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44109 1727204230.23150: variable 'omit' from source: magic vars 44109 1727204230.23533: variable 'ansible_distribution_major_version' from source: facts 44109 1727204230.23550: Evaluated conditional (ansible_distribution_major_version != '6'): True 44109 1727204230.23562: variable 'omit' from source: magic vars 44109 1727204230.23823: variable 'omit' from source: magic vars 44109 1727204230.24287: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 44109 1727204230.28142: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 44109 1727204230.28554: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 44109 1727204230.28613: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 44109 1727204230.28652: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 44109 1727204230.28685: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 44109 1727204230.28808: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44109 1727204230.28851: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44109 1727204230.28886: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44109 1727204230.29027: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44109 1727204230.29030: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44109 1727204230.29085: variable '__network_is_ostree' from source: set_fact 44109 1727204230.29095: variable 'omit' from source: magic vars 44109 1727204230.29131: variable 'omit' from source: magic vars 44109 1727204230.29249: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 44109 1727204230.29259: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 44109 1727204230.29262: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 44109 1727204230.29265: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44109 1727204230.29271: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44109 1727204230.29310: variable 'inventory_hostname' from source: host vars for 'managed-node1' 44109 1727204230.29321: variable 'ansible_host' from source: host vars for 'managed-node1' 44109 1727204230.29357: variable 'ansible_ssh_extra_args' from 
source: host vars for 'managed-node1' 44109 1727204230.29466: Set connection var ansible_connection to ssh 44109 1727204230.29470: Set connection var ansible_timeout to 10 44109 1727204230.29473: Set connection var ansible_module_compression to ZIP_DEFLATED 44109 1727204230.29477: Set connection var ansible_pipelining to False 44109 1727204230.29646: Set connection var ansible_shell_executable to /bin/sh 44109 1727204230.29650: Set connection var ansible_shell_type to sh 44109 1727204230.29652: variable 'ansible_shell_executable' from source: unknown 44109 1727204230.29654: variable 'ansible_connection' from source: unknown 44109 1727204230.29656: variable 'ansible_module_compression' from source: unknown 44109 1727204230.29657: variable 'ansible_shell_type' from source: unknown 44109 1727204230.29659: variable 'ansible_shell_executable' from source: unknown 44109 1727204230.29661: variable 'ansible_host' from source: host vars for 'managed-node1' 44109 1727204230.29662: variable 'ansible_pipelining' from source: unknown 44109 1727204230.29688: variable 'ansible_timeout' from source: unknown 44109 1727204230.29953: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44109 1727204230.29974: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 44109 1727204230.29993: variable 'omit' from source: magic vars 44109 1727204230.30004: starting attempt loop 44109 1727204230.30011: running the handler 44109 1727204230.30022: variable 'ansible_facts' from source: unknown 44109 1727204230.30030: variable 'ansible_facts' from source: unknown 44109 1727204230.30081: _low_level_execute_command(): starting 44109 1727204230.30094: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 44109 
1727204230.30877: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44109 1727204230.30896: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44109 1727204230.30956: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44109 1727204230.30999: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 44109 1727204230.31025: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44109 1727204230.31053: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44109 1727204230.31272: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44109 1727204230.32986: stdout chunk (state=3): >>>/root <<< 44109 1727204230.33150: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44109 1727204230.33153: stdout chunk (state=3): >>><<< 44109 1727204230.33155: stderr chunk (state=3): >>><<< 44109 1727204230.33266: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44109 1727204230.33279: _low_level_execute_command(): starting 44109 1727204230.33282: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204230.331801-44934-15127564348439 `" && echo ansible-tmp-1727204230.331801-44934-15127564348439="` echo /root/.ansible/tmp/ansible-tmp-1727204230.331801-44934-15127564348439 `" ) && sleep 0' 44109 1727204230.33860: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44109 1727204230.33873: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44109 1727204230.33890: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44109 1727204230.33906: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44109 1727204230.33995: stderr chunk (state=3): >>>debug2: checking match for 'final 
all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44109 1727204230.34035: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 44109 1727204230.34051: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44109 1727204230.34069: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44109 1727204230.34183: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44109 1727204230.36340: stdout chunk (state=3): >>>ansible-tmp-1727204230.331801-44934-15127564348439=/root/.ansible/tmp/ansible-tmp-1727204230.331801-44934-15127564348439 <<< 44109 1727204230.36488: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44109 1727204230.36507: stderr chunk (state=3): >>><<< 44109 1727204230.36516: stdout chunk (state=3): >>><<< 44109 1727204230.36539: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204230.331801-44934-15127564348439=/root/.ansible/tmp/ansible-tmp-1727204230.331801-44934-15127564348439 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not 
found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44109 1727204230.36572: variable 'ansible_module_compression' from source: unknown 44109 1727204230.36684: ANSIBALLZ: Using generic lock for ansible.legacy.dnf 44109 1727204230.36687: ANSIBALLZ: Acquiring lock 44109 1727204230.36690: ANSIBALLZ: Lock acquired: 139907468546112 44109 1727204230.36692: ANSIBALLZ: Creating module 44109 1727204230.61732: ANSIBALLZ: Writing module into payload 44109 1727204230.61913: ANSIBALLZ: Writing module 44109 1727204230.61931: ANSIBALLZ: Renaming module 44109 1727204230.61945: ANSIBALLZ: Done creating module 44109 1727204230.61965: variable 'ansible_facts' from source: unknown 44109 1727204230.62095: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204230.331801-44934-15127564348439/AnsiballZ_dnf.py 44109 1727204230.62218: Sending initial data 44109 1727204230.62222: Sent initial data (150 bytes) 44109 1727204230.62827: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44109 1727204230.62861: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44109 1727204230.62871: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 44109 1727204230.62892: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44109 1727204230.62999: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK <<< 44109 1727204230.63018: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44109 1727204230.63229: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44109 1727204230.64978: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 44109 1727204230.64987: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 <<< 44109 1727204230.65015: stderr chunk (state=3): >>>debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" 
revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 44109 1727204230.65102: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 44109 1727204230.65202: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-44109pzfqangk/tmpx8cs006k /root/.ansible/tmp/ansible-tmp-1727204230.331801-44934-15127564348439/AnsiballZ_dnf.py <<< 44109 1727204230.65206: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204230.331801-44934-15127564348439/AnsiballZ_dnf.py" <<< 44109 1727204230.65271: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-44109pzfqangk/tmpx8cs006k" to remote "/root/.ansible/tmp/ansible-tmp-1727204230.331801-44934-15127564348439/AnsiballZ_dnf.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204230.331801-44934-15127564348439/AnsiballZ_dnf.py" <<< 44109 1727204230.66802: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44109 1727204230.66810: stdout chunk (state=3): >>><<< 44109 1727204230.66813: stderr chunk (state=3): >>><<< 44109 1727204230.66908: done transferring module to remote 44109 1727204230.66911: _low_level_execute_command(): starting 44109 1727204230.66922: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204230.331801-44934-15127564348439/ /root/.ansible/tmp/ansible-tmp-1727204230.331801-44934-15127564348439/AnsiballZ_dnf.py && sleep 0' 44109 1727204230.67611: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44109 1727204230.67625: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44109 1727204230.67641: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44109 1727204230.67672: stderr chunk (state=3): >>>debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44109 1727204230.67790: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 44109 1727204230.67887: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44109 1727204230.70060: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44109 1727204230.70063: stdout chunk (state=3): >>><<< 44109 1727204230.70066: stderr chunk (state=3): >>><<< 44109 1727204230.70068: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration 
data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44109 1727204230.70233: _low_level_execute_command(): starting 44109 1727204230.70237: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204230.331801-44934-15127564348439/AnsiballZ_dnf.py && sleep 0' 44109 1727204230.71496: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 44109 1727204230.71545: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44109 1727204230.71579: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 44109 1727204230.71794: stderr chunk (state=3): 
>>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 44109 1727204230.71898: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44109 1727204231.17520: stdout chunk (state=3): >>> {"msg": "Nothing to do", "changed": false, "results": [], "rc": 0, "invocation": {"module_args": {"name": ["iproute"], "state": "present", "allow_downgrade": false, "allowerasing": false, "autoremove": false, "bugfix": false, "cacheonly": false, "disable_gpg_check": false, "disable_plugin": [], "disablerepo": [], "download_only": false, "enable_plugin": [], "enablerepo": [], "exclude": [], "installroot": "/", "install_repoquery": true, "install_weak_deps": true, "security": false, "skip_broken": false, "update_cache": false, "update_only": false, "validate_certs": true, "sslverify": true, "lock_timeout": 30, "use_backend": "auto", "best": null, "conf_file": null, "disable_excludes": null, "download_dir": null, "list": null, "nobest": null, "releasever": null}}} <<< 44109 1727204231.32609: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.47 closed. 
<<< 44109 1727204231.32623: stdout chunk (state=3): >>><<< 44109 1727204231.32626: stderr chunk (state=3): >>><<< 44109 1727204231.32632: _low_level_execute_command() done: rc=0, stdout= {"msg": "Nothing to do", "changed": false, "results": [], "rc": 0, "invocation": {"module_args": {"name": ["iproute"], "state": "present", "allow_downgrade": false, "allowerasing": false, "autoremove": false, "bugfix": false, "cacheonly": false, "disable_gpg_check": false, "disable_plugin": [], "disablerepo": [], "download_only": false, "enable_plugin": [], "enablerepo": [], "exclude": [], "installroot": "/", "install_repoquery": true, "install_weak_deps": true, "security": false, "skip_broken": false, "update_cache": false, "update_only": false, "validate_certs": true, "sslverify": true, "lock_timeout": 30, "use_backend": "auto", "best": null, "conf_file": null, "disable_excludes": null, "download_dir": null, "list": null, "nobest": null, "releasever": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: 
mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.47 closed. 44109 1727204231.32717: done with _execute_module (ansible.legacy.dnf, {'name': 'iproute', 'state': 'present', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.dnf', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204230.331801-44934-15127564348439/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 44109 1727204231.32774: _low_level_execute_command(): starting 44109 1727204231.32984: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204230.331801-44934-15127564348439/ > /dev/null 2>&1 && sleep 0' 44109 1727204231.34491: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 44109 1727204231.34497: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44109 1727204231.34515: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 <<< 44109 1727204231.34530: stderr chunk (state=3): >>>debug2: match not found <<< 44109 1727204231.34545: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44109 1727204231.34593: stderr chunk (state=3): >>>debug1: configuration requests final Match 
pass <<< 44109 1727204231.34774: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK <<< 44109 1727204231.34844: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44109 1727204231.34942: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44109 1727204231.37012: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44109 1727204231.37016: stdout chunk (state=3): >>><<< 44109 1727204231.37018: stderr chunk (state=3): >>><<< 44109 1727204231.37128: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44109 1727204231.37132: handler run complete 44109 1727204231.37817: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 44109 1727204231.38223: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 44109 1727204231.38583: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 44109 1727204231.38651: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 44109 1727204231.38993: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 44109 1727204231.39247: variable '__install_status' from source: unknown 44109 1727204231.39310: Evaluated conditional (__install_status is success): True 44109 1727204231.39401: attempt loop complete, returning result 44109 1727204231.39404: _execute() done 44109 1727204231.39407: dumping result to json 44109 1727204231.39409: done dumping result, returning 44109 1727204231.39999: done running TaskExecutor() for managed-node1/TASK: Install iproute [028d2410-947f-ed67-a560-0000000001cf] 44109 1727204231.40002: sending task result for task 028d2410-947f-ed67-a560-0000000001cf 44109 1727204231.40073: done sending task result for task 028d2410-947f-ed67-a560-0000000001cf 44109 1727204231.40079: WORKER PROCESS EXITING ok: [managed-node1] => { "attempts": 1, "changed": false, "rc": 0, "results": [] } MSG: Nothing to do 44109 1727204231.40192: no more pending results, returning what we have 44109 1727204231.40196: results queue empty 44109 1727204231.40197: checking for any_errors_fatal 44109 
1727204231.40202: done checking for any_errors_fatal 44109 1727204231.40210: checking for max_fail_percentage 44109 1727204231.40212: done checking for max_fail_percentage 44109 1727204231.40213: checking to see if all hosts have failed and the running result is not ok 44109 1727204231.40214: done checking to see if all hosts have failed 44109 1727204231.40214: getting the remaining hosts for this loop 44109 1727204231.40216: done getting the remaining hosts for this loop 44109 1727204231.40220: getting the next task for host managed-node1 44109 1727204231.40226: done getting next task for host managed-node1 44109 1727204231.40229: ^ task is: TASK: Create veth interface {{ interface }} 44109 1727204231.40232: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44109 1727204231.40236: getting variables 44109 1727204231.40238: in VariableManager get_vars() 44109 1727204231.40274: Calling all_inventory to load vars for managed-node1 44109 1727204231.40668: Calling groups_inventory to load vars for managed-node1 44109 1727204231.40672: Calling all_plugins_inventory to load vars for managed-node1 44109 1727204231.40687: Calling all_plugins_play to load vars for managed-node1 44109 1727204231.40690: Calling groups_plugins_inventory to load vars for managed-node1 44109 1727204231.40693: Calling groups_plugins_play to load vars for managed-node1 44109 1727204231.41248: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44109 1727204231.41765: done with get_vars() 44109 1727204231.41778: done getting variables 44109 1727204231.41829: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 44109 1727204231.42183: variable 'interface' from source: set_fact TASK [Create veth interface ethtest0] ****************************************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:27 Tuesday 24 September 2024 14:57:11 -0400 (0:00:01.199) 0:00:08.218 ***** 44109 1727204231.42213: entering _queue_task() for managed-node1/command 44109 1727204231.42774: worker is 1 (out of 1 available) 44109 1727204231.42790: exiting _queue_task() for managed-node1/command 44109 1727204231.42802: done queuing things up, now waiting for results queue to drain 44109 1727204231.42803: waiting for pending results... 
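The TASK [Create veth interface ethtest0] header above points at tasks/manage_test_interface.yml:27, and the executor goes on to load the 'command' action plugin and evaluate the conditional (type == 'veth' and state == 'present' and interface not in current_interfaces). Putting those pieces together, the task being run is roughly of this shape — a hypothetical sketch reconstructed from the log, not the actual role source; in particular the exact `ip link` arguments are an assumption:

```yaml
# Hypothetical reconstruction -- not the actual manage_test_interface.yml.
# The `when` clause matches the conditional evaluated in the log;
# the command line itself is an assumed typical veth-creation invocation.
- name: Create veth interface {{ interface }}
  command: ip link add {{ interface }} type veth peer name peer{{ interface }}
  when:
    - type == 'veth'
    - state == 'present'
    - interface not in current_interfaces
```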
44109 1727204231.43493: running TaskExecutor() for managed-node1/TASK: Create veth interface ethtest0 44109 1727204231.43498: in run() - task 028d2410-947f-ed67-a560-0000000001d0 44109 1727204231.43501: variable 'ansible_search_path' from source: unknown 44109 1727204231.43504: variable 'ansible_search_path' from source: unknown 44109 1727204231.44118: variable 'interface' from source: set_fact 44109 1727204231.44203: variable 'interface' from source: set_fact 44109 1727204231.44279: variable 'interface' from source: set_fact 44109 1727204231.44420: Loaded config def from plugin (lookup/items) 44109 1727204231.44432: Loading LookupModule 'items' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/items.py 44109 1727204231.44458: variable 'omit' from source: magic vars 44109 1727204231.44570: variable 'ansible_host' from source: host vars for 'managed-node1' 44109 1727204231.44585: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44109 1727204231.44599: variable 'omit' from source: magic vars 44109 1727204231.44886: variable 'ansible_distribution_major_version' from source: facts 44109 1727204231.44898: Evaluated conditional (ansible_distribution_major_version != '6'): True 44109 1727204231.45102: variable 'type' from source: set_fact 44109 1727204231.45114: variable 'state' from source: include params 44109 1727204231.45282: variable 'interface' from source: set_fact 44109 1727204231.45285: variable 'current_interfaces' from source: set_fact 44109 1727204231.45287: Evaluated conditional (type == 'veth' and state == 'present' and interface not in current_interfaces): True 44109 1727204231.45289: variable 'omit' from source: magic vars 44109 1727204231.45291: variable 'omit' from source: magic vars 44109 1727204231.45293: variable 'item' from source: unknown 44109 1727204231.45294: variable 'item' from source: unknown 44109 1727204231.45308: variable 'omit' from source: magic vars 44109 1727204231.45338: trying 
/usr/local/lib/python3.12/site-packages/ansible/plugins/connection 44109 1727204231.45367: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 44109 1727204231.45388: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 44109 1727204231.45410: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44109 1727204231.45424: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44109 1727204231.45454: variable 'inventory_hostname' from source: host vars for 'managed-node1' 44109 1727204231.45461: variable 'ansible_host' from source: host vars for 'managed-node1' 44109 1727204231.45467: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44109 1727204231.45563: Set connection var ansible_connection to ssh 44109 1727204231.45623: Set connection var ansible_timeout to 10 44109 1727204231.45626: Set connection var ansible_module_compression to ZIP_DEFLATED 44109 1727204231.45628: Set connection var ansible_pipelining to False 44109 1727204231.45630: Set connection var ansible_shell_executable to /bin/sh 44109 1727204231.45632: Set connection var ansible_shell_type to sh 44109 1727204231.45634: variable 'ansible_shell_executable' from source: unknown 44109 1727204231.45635: variable 'ansible_connection' from source: unknown 44109 1727204231.45637: variable 'ansible_module_compression' from source: unknown 44109 1727204231.45639: variable 'ansible_shell_type' from source: unknown 44109 1727204231.45645: variable 'ansible_shell_executable' from source: unknown 44109 1727204231.45650: variable 'ansible_host' from source: host vars for 'managed-node1' 44109 1727204231.45657: variable 'ansible_pipelining' from source: unknown 44109 1727204231.45662: variable 'ansible_timeout' from 
source: unknown 44109 1727204231.45668: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44109 1727204231.45801: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 44109 1727204231.45817: variable 'omit' from source: magic vars 44109 1727204231.45826: starting attempt loop 44109 1727204231.45839: running the handler 44109 1727204231.45948: _low_level_execute_command(): starting 44109 1727204231.45951: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 44109 1727204231.46556: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44109 1727204231.46613: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44109 1727204231.46683: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 44109 1727204231.46712: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44109 
1727204231.46736: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44109 1727204231.46844: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44109 1727204231.49017: stdout chunk (state=3): >>>/root <<< 44109 1727204231.49022: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44109 1727204231.49025: stdout chunk (state=3): >>><<< 44109 1727204231.49027: stderr chunk (state=3): >>><<< 44109 1727204231.49030: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44109 1727204231.49032: _low_level_execute_command(): starting 44109 1727204231.49034: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo 
/root/.ansible/tmp/ansible-tmp-1727204231.489264-44986-99597693322535 `" && echo ansible-tmp-1727204231.489264-44986-99597693322535="` echo /root/.ansible/tmp/ansible-tmp-1727204231.489264-44986-99597693322535 `" ) && sleep 0' 44109 1727204231.50314: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44109 1727204231.50395: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found <<< 44109 1727204231.50398: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44109 1727204231.50466: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 44109 1727204231.50666: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 <<< 44109 1727204231.52816: stdout chunk (state=3): >>>ansible-tmp-1727204231.489264-44986-99597693322535=/root/.ansible/tmp/ansible-tmp-1727204231.489264-44986-99597693322535 <<< 44109 1727204231.53098: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44109 1727204231.53136: stderr chunk (state=3): >>><<< 44109 1727204231.53190: stdout chunk 
(state=3): >>><<< 44109 1727204231.53217: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204231.489264-44986-99597693322535=/root/.ansible/tmp/ansible-tmp-1727204231.489264-44986-99597693322535 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44109 1727204231.53381: variable 'ansible_module_compression' from source: unknown 44109 1727204231.53411: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-44109pzfqangk/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 44109 1727204231.53449: variable 'ansible_facts' from source: unknown 44109 1727204231.53723: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204231.489264-44986-99597693322535/AnsiballZ_command.py 44109 1727204231.54189: Sending initial data 44109 1727204231.54193: Sent initial data (154 bytes) 44109 1727204231.55235: stderr chunk (state=3): 
>>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 44109 1727204231.55249: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 44109 1727204231.55261: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44109 1727204231.55414: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 44109 1727204231.55494: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 44109 1727204231.55555: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44109 1727204231.57426: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension 
"expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 44109 1727204231.57500: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 44109 1727204231.57585: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-44109pzfqangk/tmpc93mvzp7 /root/.ansible/tmp/ansible-tmp-1727204231.489264-44986-99597693322535/AnsiballZ_command.py <<< 44109 1727204231.57588: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204231.489264-44986-99597693322535/AnsiballZ_command.py" <<< 44109 1727204231.57644: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-44109pzfqangk/tmpc93mvzp7" to remote "/root/.ansible/tmp/ansible-tmp-1727204231.489264-44986-99597693322535/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204231.489264-44986-99597693322535/AnsiballZ_command.py" <<< 44109 1727204231.59280: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44109 1727204231.59442: stderr chunk (state=3): >>><<< 44109 1727204231.59446: stdout chunk (state=3): >>><<< 44109 1727204231.59448: done transferring module to remote 44109 1727204231.59450: _low_level_execute_command(): starting 44109 1727204231.59452: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204231.489264-44986-99597693322535/ /root/.ansible/tmp/ansible-tmp-1727204231.489264-44986-99597693322535/AnsiballZ_command.py && sleep 0' 44109 1727204231.60761: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44109 1727204231.60872: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44109 1727204231.61089: stderr chunk (state=3): >>>debug1: 
Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 44109 1727204231.61110: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44109 1727204231.61133: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44109 1727204231.61316: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44109 1727204231.63373: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44109 1727204231.63489: stderr chunk (state=3): >>><<< 44109 1727204231.63497: stdout chunk (state=3): >>><<< 44109 1727204231.63520: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address 
debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44109 1727204231.63531: _low_level_execute_command(): starting 44109 1727204231.63541: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204231.489264-44986-99597693322535/AnsiballZ_command.py && sleep 0' 44109 1727204231.64894: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 44109 1727204231.65016: stderr chunk (state=3): 
>>>debug2: fd 3 setting O_NONBLOCK <<< 44109 1727204231.65034: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44109 1727204231.65151: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44109 1727204231.82602: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "add", "ethtest0", "type", "veth", "peer", "name", "peerethtest0"], "start": "2024-09-24 14:57:11.813561", "end": "2024-09-24 14:57:11.818878", "delta": "0:00:00.005317", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link add ethtest0 type veth peer name peerethtest0", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 44109 1727204231.85793: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.47 closed. 
<<< 44109 1727204231.85821: stdout chunk (state=3): >>><<< 44109 1727204231.85836: stderr chunk (state=3): >>><<< 44109 1727204231.85860: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "add", "ethtest0", "type", "veth", "peer", "name", "peerethtest0"], "start": "2024-09-24 14:57:11.813561", "end": "2024-09-24 14:57:11.818878", "delta": "0:00:00.005317", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link add ethtest0 type veth peer name peerethtest0", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.47 closed. 
44109 1727204231.85933: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip link add ethtest0 type veth peer name peerethtest0', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204231.489264-44986-99597693322535/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 44109 1727204231.85948: _low_level_execute_command(): starting 44109 1727204231.85957: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204231.489264-44986-99597693322535/ > /dev/null 2>&1 && sleep 0' 44109 1727204231.86731: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44109 1727204231.86844: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 
debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK <<< 44109 1727204231.86889: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44109 1727204231.87010: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44109 1727204231.91405: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44109 1727204231.91437: stdout chunk (state=3): >>><<< 44109 1727204231.91465: stderr chunk (state=3): >>><<< 44109 1727204231.91496: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44109 1727204231.91563: handler run complete 44109 1727204231.91695: Evaluated conditional (False): False 44109 
1727204231.91699: attempt loop complete, returning result 44109 1727204231.91701: variable 'item' from source: unknown 44109 1727204231.91952: variable 'item' from source: unknown ok: [managed-node1] => (item=ip link add ethtest0 type veth peer name peerethtest0) => { "ansible_loop_var": "item", "changed": false, "cmd": [ "ip", "link", "add", "ethtest0", "type", "veth", "peer", "name", "peerethtest0" ], "delta": "0:00:00.005317", "end": "2024-09-24 14:57:11.818878", "item": "ip link add ethtest0 type veth peer name peerethtest0", "rc": 0, "start": "2024-09-24 14:57:11.813561" } 44109 1727204231.92582: variable 'ansible_host' from source: host vars for 'managed-node1' 44109 1727204231.92585: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44109 1727204231.92587: variable 'omit' from source: magic vars 44109 1727204231.92590: variable 'ansible_distribution_major_version' from source: facts 44109 1727204231.92592: Evaluated conditional (ansible_distribution_major_version != '6'): True 44109 1727204231.93132: variable 'type' from source: set_fact 44109 1727204231.93135: variable 'state' from source: include params 44109 1727204231.93137: variable 'interface' from source: set_fact 44109 1727204231.93140: variable 'current_interfaces' from source: set_fact 44109 1727204231.93142: Evaluated conditional (type == 'veth' and state == 'present' and interface not in current_interfaces): True 44109 1727204231.93144: variable 'omit' from source: magic vars 44109 1727204231.93146: variable 'omit' from source: magic vars 44109 1727204231.93246: variable 'item' from source: unknown 44109 1727204231.93494: variable 'item' from source: unknown 44109 1727204231.93502: variable 'omit' from source: magic vars 44109 1727204231.93506: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 44109 1727204231.93637: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44109 1727204231.93698: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44109 1727204231.93734: variable 'inventory_hostname' from source: host vars for 'managed-node1' 44109 1727204231.93819: variable 'ansible_host' from source: host vars for 'managed-node1' 44109 1727204231.93823: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44109 1727204231.93944: Set connection var ansible_connection to ssh 44109 1727204231.93962: Set connection var ansible_timeout to 10 44109 1727204231.93972: Set connection var ansible_module_compression to ZIP_DEFLATED 44109 1727204231.93987: Set connection var ansible_pipelining to False 44109 1727204231.93996: Set connection var ansible_shell_executable to /bin/sh 44109 1727204231.94025: Set connection var ansible_shell_type to sh 44109 1727204231.94040: variable 'ansible_shell_executable' from source: unknown 44109 1727204231.94047: variable 'ansible_connection' from source: unknown 44109 1727204231.94054: variable 'ansible_module_compression' from source: unknown 44109 1727204231.94134: variable 'ansible_shell_type' from source: unknown 44109 1727204231.94139: variable 'ansible_shell_executable' from source: unknown 44109 1727204231.94142: variable 'ansible_host' from source: host vars for 'managed-node1' 44109 1727204231.94144: variable 'ansible_pipelining' from source: unknown 44109 1727204231.94146: variable 'ansible_timeout' from source: unknown 44109 1727204231.94148: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44109 1727204231.94424: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 44109 1727204231.94440: variable 'omit' from source: magic vars 44109 1727204231.94460: starting attempt loop 44109 1727204231.94463: running the handler 44109 1727204231.94513: _low_level_execute_command(): starting 44109 1727204231.94516: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 44109 1727204231.95893: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44109 1727204231.95897: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44109 1727204231.96007: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44109 1727204231.96138: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK <<< 44109 1727204231.96167: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44109 1727204231.96298: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44109 1727204231.98093: stdout chunk (state=3): 
>>>/root <<< 44109 1727204231.98260: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44109 1727204231.98266: stdout chunk (state=3): >>><<< 44109 1727204231.98268: stderr chunk (state=3): >>><<< 44109 1727204231.98379: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44109 1727204231.98388: _low_level_execute_command(): starting 44109 1727204231.98391: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204231.9829648-44986-27690733239560 `" && echo ansible-tmp-1727204231.9829648-44986-27690733239560="` echo /root/.ansible/tmp/ansible-tmp-1727204231.9829648-44986-27690733239560 `" ) && sleep 0' 44109 1727204231.99287: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 
2024 <<< 44109 1727204231.99320: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44109 1727204231.99337: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44109 1727204231.99450: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK <<< 44109 1727204231.99495: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44109 1727204231.99588: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44109 1727204232.01700: stdout chunk (state=3): >>>ansible-tmp-1727204231.9829648-44986-27690733239560=/root/.ansible/tmp/ansible-tmp-1727204231.9829648-44986-27690733239560 <<< 44109 1727204232.01920: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44109 1727204232.02180: stdout chunk (state=3): >>><<< 44109 1727204232.02185: stderr chunk (state=3): >>><<< 44109 1727204232.02188: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204231.9829648-44986-27690733239560=/root/.ansible/tmp/ansible-tmp-1727204231.9829648-44986-27690733239560 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 
debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44109 1727204232.02190: variable 'ansible_module_compression' from source: unknown 44109 1727204232.02197: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-44109pzfqangk/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 44109 1727204232.02200: variable 'ansible_facts' from source: unknown 44109 1727204232.02203: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204231.9829648-44986-27690733239560/AnsiballZ_command.py 44109 1727204232.02513: Sending initial data 44109 1727204232.02523: Sent initial data (155 bytes) 44109 1727204232.03768: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44109 1727204232.03849: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 44109 1727204232.03867: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44109 1727204232.03922: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44109 1727204232.04071: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44109 1727204232.05831: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 44109 1727204232.06005: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 44109 1727204232.06110: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-44109pzfqangk/tmpkug7ka49 /root/.ansible/tmp/ansible-tmp-1727204231.9829648-44986-27690733239560/AnsiballZ_command.py <<< 44109 1727204232.06239: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204231.9829648-44986-27690733239560/AnsiballZ_command.py" <<< 44109 1727204232.06242: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-44109pzfqangk/tmpkug7ka49" to remote "/root/.ansible/tmp/ansible-tmp-1727204231.9829648-44986-27690733239560/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204231.9829648-44986-27690733239560/AnsiballZ_command.py" <<< 44109 1727204232.07710: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44109 1727204232.07781: stderr chunk (state=3): >>><<< 44109 1727204232.07808: stdout chunk (state=3): >>><<< 44109 1727204232.07860: done transferring module to remote 44109 1727204232.07956: _low_level_execute_command(): starting 44109 1727204232.07970: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204231.9829648-44986-27690733239560/ /root/.ansible/tmp/ansible-tmp-1727204231.9829648-44986-27690733239560/AnsiballZ_command.py && sleep 0' 44109 1727204232.08592: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44109 1727204232.08599: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 44109 1727204232.08601: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 <<< 44109 1727204232.08664: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44109 1727204232.08719: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK <<< 44109 1727204232.08728: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44109 1727204232.08835: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44109 1727204232.11106: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44109 1727204232.11110: stdout chunk (state=3): >>><<< 44109 1727204232.11127: stderr chunk (state=3): >>><<< 44109 1727204232.11130: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' 
host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44109 1727204232.11133: _low_level_execute_command(): starting 44109 1727204232.11135: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204231.9829648-44986-27690733239560/AnsiballZ_command.py && sleep 0' 44109 1727204232.11920: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 44109 1727204232.11965: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44109 1727204232.11980: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44109 1727204232.12129: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session 
id: 2 <<< 44109 1727204232.29217: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "set", "peerethtest0", "up"], "start": "2024-09-24 14:57:12.284847", "end": "2024-09-24 14:57:12.288864", "delta": "0:00:00.004017", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link set peerethtest0 up", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 44109 1727204232.31183: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.47 closed. <<< 44109 1727204232.31187: stdout chunk (state=3): >>><<< 44109 1727204232.31189: stderr chunk (state=3): >>><<< 44109 1727204232.31192: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "set", "peerethtest0", "up"], "start": "2024-09-24 14:57:12.284847", "end": "2024-09-24 14:57:12.288864", "delta": "0:00:00.004017", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link set peerethtest0 up", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.47 closed. 44109 1727204232.31195: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip link set peerethtest0 up', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204231.9829648-44986-27690733239560/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 44109 1727204232.31197: _low_level_execute_command(): starting 44109 1727204232.31199: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204231.9829648-44986-27690733239560/ > /dev/null 2>&1 && sleep 0' 44109 1727204232.32883: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44109 1727204232.32887: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44109 1727204232.32890: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44109 1727204232.33047: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 44109 1727204232.33125: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44109 1727204232.35345: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44109 1727204232.35349: stdout chunk (state=3): >>><<< 44109 1727204232.35351: stderr chunk (state=3): >>><<< 44109 1727204232.35372: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration 
data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44109 1727204232.35378: handler run complete 44109 1727204232.35398: Evaluated conditional (False): False 44109 1727204232.35407: attempt loop complete, returning result 44109 1727204232.35428: variable 'item' from source: unknown 44109 1727204232.35518: variable 'item' from source: unknown ok: [managed-node1] => (item=ip link set peerethtest0 up) => { "ansible_loop_var": "item", "changed": false, "cmd": [ "ip", "link", "set", "peerethtest0", "up" ], "delta": "0:00:00.004017", "end": "2024-09-24 14:57:12.288864", "item": "ip link set peerethtest0 up", "rc": 0, "start": "2024-09-24 14:57:12.284847" } 44109 1727204232.35900: variable 'ansible_host' from source: host vars for 'managed-node1' 44109 1727204232.35903: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44109 1727204232.35906: variable 'omit' from source: magic vars 44109 1727204232.36185: variable 'ansible_distribution_major_version' from source: facts 44109 1727204232.36188: Evaluated conditional (ansible_distribution_major_version != '6'): True 44109 1727204232.36582: variable 'type' from source: set_fact 44109 1727204232.36585: variable 'state' from source: include params 44109 1727204232.36587: variable 'interface' from source: set_fact 44109 1727204232.36589: variable 'current_interfaces' from source: set_fact 44109 1727204232.36591: Evaluated conditional (type == 'veth' and state == 'present' and interface not in current_interfaces): True 44109 
1727204232.36597: variable 'omit' from source: magic vars 44109 1727204232.36780: variable 'omit' from source: magic vars 44109 1727204232.36783: variable 'item' from source: unknown 44109 1727204232.36863: variable 'item' from source: unknown 44109 1727204232.36897: variable 'omit' from source: magic vars 44109 1727204232.36969: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 44109 1727204232.36985: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44109 1727204232.37061: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44109 1727204232.37180: variable 'inventory_hostname' from source: host vars for 'managed-node1' 44109 1727204232.37183: variable 'ansible_host' from source: host vars for 'managed-node1' 44109 1727204232.37186: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44109 1727204232.37283: Set connection var ansible_connection to ssh 44109 1727204232.37286: Set connection var ansible_timeout to 10 44109 1727204232.37293: Set connection var ansible_module_compression to ZIP_DEFLATED 44109 1727204232.37302: Set connection var ansible_pipelining to False 44109 1727204232.37309: Set connection var ansible_shell_executable to /bin/sh 44109 1727204232.37316: Set connection var ansible_shell_type to sh 44109 1727204232.37336: variable 'ansible_shell_executable' from source: unknown 44109 1727204232.37382: variable 'ansible_connection' from source: unknown 44109 1727204232.37397: variable 'ansible_module_compression' from source: unknown 44109 1727204232.37404: variable 'ansible_shell_type' from source: unknown 44109 1727204232.37410: variable 'ansible_shell_executable' from source: unknown 44109 1727204232.37417: variable 'ansible_host' from source: host vars for 
'managed-node1' 44109 1727204232.37424: variable 'ansible_pipelining' from source: unknown 44109 1727204232.37580: variable 'ansible_timeout' from source: unknown 44109 1727204232.37583: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44109 1727204232.37587: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 44109 1727204232.37794: variable 'omit' from source: magic vars 44109 1727204232.37797: starting attempt loop 44109 1727204232.37800: running the handler 44109 1727204232.37801: _low_level_execute_command(): starting 44109 1727204232.37805: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 44109 1727204232.38937: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44109 1727204232.38974: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44109 1727204232.39170: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44109 
1727204232.39192: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK <<< 44109 1727204232.39558: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44109 1727204232.39667: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44109 1727204232.41480: stdout chunk (state=3): >>>/root <<< 44109 1727204232.41694: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44109 1727204232.41697: stdout chunk (state=3): >>><<< 44109 1727204232.41704: stderr chunk (state=3): >>><<< 44109 1727204232.41726: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44109 1727204232.41734: _low_level_execute_command(): starting 44109 1727204232.41740: 
_low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204232.4172473-44986-176184150136553 `" && echo ansible-tmp-1727204232.4172473-44986-176184150136553="` echo /root/.ansible/tmp/ansible-tmp-1727204232.4172473-44986-176184150136553 `" ) && sleep 0' 44109 1727204232.43555: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 44109 1727204232.43560: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found <<< 44109 1727204232.43682: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44109 1727204232.43685: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 44109 1727204232.43890: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK <<< 44109 1727204232.44020: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44109 1727204232.44411: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44109 1727204232.46527: stdout chunk (state=3): 
>>>ansible-tmp-1727204232.4172473-44986-176184150136553=/root/.ansible/tmp/ansible-tmp-1727204232.4172473-44986-176184150136553 <<< 44109 1727204232.46838: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44109 1727204232.46841: stdout chunk (state=3): >>><<< 44109 1727204232.46843: stderr chunk (state=3): >>><<< 44109 1727204232.46846: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204232.4172473-44986-176184150136553=/root/.ansible/tmp/ansible-tmp-1727204232.4172473-44986-176184150136553 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44109 1727204232.46848: variable 'ansible_module_compression' from source: unknown 44109 1727204232.46851: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-44109pzfqangk/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 44109 1727204232.46852: variable 'ansible_facts' 
from source: unknown 44109 1727204232.46911: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204232.4172473-44986-176184150136553/AnsiballZ_command.py 44109 1727204232.47197: Sending initial data 44109 1727204232.47207: Sent initial data (156 bytes) 44109 1727204232.48899: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44109 1727204232.49309: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 44109 1727204232.49431: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 44109 1727204232.49498: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44109 1727204232.51246: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 44109 1727204232.51290: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension 
"hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 44109 1727204232.51442: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 44109 1727204232.51445: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204232.4172473-44986-176184150136553/AnsiballZ_command.py" <<< 44109 1727204232.51447: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-44109pzfqangk/tmpy_yvo6mw /root/.ansible/tmp/ansible-tmp-1727204232.4172473-44986-176184150136553/AnsiballZ_command.py <<< 44109 1727204232.51527: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-44109pzfqangk/tmpy_yvo6mw" to remote "/root/.ansible/tmp/ansible-tmp-1727204232.4172473-44986-176184150136553/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204232.4172473-44986-176184150136553/AnsiballZ_command.py" <<< 44109 1727204232.52983: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44109 1727204232.53183: stderr chunk (state=3): >>><<< 44109 1727204232.53186: stdout chunk (state=3): >>><<< 44109 1727204232.53271: done transferring module to remote 44109 1727204232.53288: _low_level_execute_command(): starting 44109 1727204232.53298: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204232.4172473-44986-176184150136553/ /root/.ansible/tmp/ansible-tmp-1727204232.4172473-44986-176184150136553/AnsiballZ_command.py 
&& sleep 0' 44109 1727204232.54838: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44109 1727204232.54846: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44109 1727204232.55005: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 44109 1727204232.55053: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44109 1727204232.55089: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44109 1727204232.55272: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44109 1727204232.57330: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44109 1727204232.57334: stdout chunk (state=3): >>><<< 44109 1727204232.57336: stderr chunk (state=3): >>><<< 44109 1727204232.57339: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match 
for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44109 1727204232.57342: _low_level_execute_command(): starting 44109 1727204232.57534: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204232.4172473-44986-176184150136553/AnsiballZ_command.py && sleep 0' 44109 1727204232.58723: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 44109 1727204232.58726: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44109 1727204232.58729: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 44109 1727204232.58731: stderr chunk 
(state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44109 1727204232.58780: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 44109 1727204232.59244: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 44109 1727204232.59247: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44109 1727204232.76179: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "set", "ethtest0", "up"], "start": "2024-09-24 14:57:12.755135", "end": "2024-09-24 14:57:12.759099", "delta": "0:00:00.003964", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link set ethtest0 up", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 44109 1727204232.77984: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.47 closed. 
<<< 44109 1727204232.77990: stdout chunk (state=3): >>><<< 44109 1727204232.77992: stderr chunk (state=3): >>><<< 44109 1727204232.78131: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "set", "ethtest0", "up"], "start": "2024-09-24 14:57:12.755135", "end": "2024-09-24 14:57:12.759099", "delta": "0:00:00.003964", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link set ethtest0 up", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.47 closed. 
44109 1727204232.78135: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip link set ethtest0 up', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204232.4172473-44986-176184150136553/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 44109 1727204232.78137: _low_level_execute_command(): starting 44109 1727204232.78140: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204232.4172473-44986-176184150136553/ > /dev/null 2>&1 && sleep 0' 44109 1727204232.78683: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44109 1727204232.78695: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44109 1727204232.78707: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44109 1727204232.79084: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf 
debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44109 1727204232.79088: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 44109 1727204232.79090: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44109 1727204232.79092: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44109 1727204232.79094: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44109 1727204232.80898: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44109 1727204232.80950: stderr chunk (state=3): >>><<< 44109 1727204232.80960: stdout chunk (state=3): >>><<< 44109 1727204232.80981: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: 
mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44109 1727204232.80984: handler run complete 44109 1727204232.81007: Evaluated conditional (False): False 44109 1727204232.81016: attempt loop complete, returning result 44109 1727204232.81036: variable 'item' from source: unknown 44109 1727204232.81117: variable 'item' from source: unknown ok: [managed-node1] => (item=ip link set ethtest0 up) => { "ansible_loop_var": "item", "changed": false, "cmd": [ "ip", "link", "set", "ethtest0", "up" ], "delta": "0:00:00.003964", "end": "2024-09-24 14:57:12.759099", "item": "ip link set ethtest0 up", "rc": 0, "start": "2024-09-24 14:57:12.755135" } 44109 1727204232.81241: dumping result to json 44109 1727204232.81243: done dumping result, returning 44109 1727204232.81245: done running TaskExecutor() for managed-node1/TASK: Create veth interface ethtest0 [028d2410-947f-ed67-a560-0000000001d0] 44109 1727204232.81247: sending task result for task 028d2410-947f-ed67-a560-0000000001d0 44109 1727204232.81350: done sending task result for task 028d2410-947f-ed67-a560-0000000001d0 44109 1727204232.81352: WORKER PROCESS EXITING 44109 1727204232.81600: no more pending results, returning what we have 44109 1727204232.81603: results queue empty 44109 1727204232.81604: checking for any_errors_fatal 44109 1727204232.81608: done checking for any_errors_fatal 44109 1727204232.81608: checking for max_fail_percentage 44109 1727204232.81610: done checking for max_fail_percentage 44109 1727204232.81612: checking to see if all hosts have failed and the running result is not ok 44109 1727204232.81613: done checking to see if all hosts have failed 44109 1727204232.81614: getting the remaining hosts for this loop 44109 1727204232.81615: done getting the remaining hosts for this loop 44109 1727204232.81618: getting the next task for host managed-node1 44109 1727204232.81623: done getting next task for host managed-node1 44109 1727204232.81625: ^ task is: TASK: 
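The task result above, together with the conditional the log evaluates (`type == 'veth' and state == 'present' and interface not in current_interfaces`) and the loop items it reports (`ip link set peerethtest0 up`, `ip link set ethtest0 up`), is consistent with a looped `command` task of roughly this shape. This is a hypothetical reconstruction for orientation only; the actual task in `manage_test_interface.yml` may differ:

```yaml
# Sketch of the "Create veth interface ethtest0" task implied by the log.
# Loop items and the when-condition are taken from the debug output above;
# any items not shown in this log excerpt are omitted.
- name: Create veth interface {{ interface }}
  command: "{{ item }}"
  loop:
    - ip link set peerethtest0 up
    - ip link set ethtest0 up
  when: type == 'veth' and state == 'present' and interface not in current_interfaces
```

Each loop item becomes one `_execute_module` round trip in the log: create a remote tmp dir, transfer `AnsiballZ_command.py` over SFTP, run it with `/usr/bin/python3.12`, then remove the tmp dir — all over the shared SSH ControlMaster session (`mux_client_request_session`).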
Set up veth as managed by NetworkManager 44109 1727204232.81627: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44109 1727204232.81630: getting variables 44109 1727204232.81631: in VariableManager get_vars() 44109 1727204232.81656: Calling all_inventory to load vars for managed-node1 44109 1727204232.81658: Calling groups_inventory to load vars for managed-node1 44109 1727204232.81660: Calling all_plugins_inventory to load vars for managed-node1 44109 1727204232.81668: Calling all_plugins_play to load vars for managed-node1 44109 1727204232.81671: Calling groups_plugins_inventory to load vars for managed-node1 44109 1727204232.81673: Calling groups_plugins_play to load vars for managed-node1 44109 1727204232.81840: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44109 1727204232.82041: done with get_vars() 44109 1727204232.82051: done getting variables 44109 1727204232.82109: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set up veth as managed by NetworkManager] ******************************** task path: 
/tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:35 Tuesday 24 September 2024 14:57:12 -0400 (0:00:01.399) 0:00:09.617 ***** 44109 1727204232.82138: entering _queue_task() for managed-node1/command 44109 1727204232.82433: worker is 1 (out of 1 available) 44109 1727204232.82445: exiting _queue_task() for managed-node1/command 44109 1727204232.82457: done queuing things up, now waiting for results queue to drain 44109 1727204232.82458: waiting for pending results... 44109 1727204232.83094: running TaskExecutor() for managed-node1/TASK: Set up veth as managed by NetworkManager 44109 1727204232.83100: in run() - task 028d2410-947f-ed67-a560-0000000001d1 44109 1727204232.83104: variable 'ansible_search_path' from source: unknown 44109 1727204232.83107: variable 'ansible_search_path' from source: unknown 44109 1727204232.83110: calling self._execute() 44109 1727204232.83115: variable 'ansible_host' from source: host vars for 'managed-node1' 44109 1727204232.83118: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44109 1727204232.83120: variable 'omit' from source: magic vars 44109 1727204232.83400: variable 'ansible_distribution_major_version' from source: facts 44109 1727204232.83485: Evaluated conditional (ansible_distribution_major_version != '6'): True 44109 1727204232.83582: variable 'type' from source: set_fact 44109 1727204232.83598: variable 'state' from source: include params 44109 1727204232.83608: Evaluated conditional (type == 'veth' and state == 'present'): True 44109 1727204232.83622: variable 'omit' from source: magic vars 44109 1727204232.83679: variable 'omit' from source: magic vars 44109 1727204232.83789: variable 'interface' from source: set_fact 44109 1727204232.83819: variable 'omit' from source: magic vars 44109 1727204232.83862: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 44109 1727204232.83899: Loading Connection 
'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 44109 1727204232.83927: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 44109 1727204232.83945: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44109 1727204232.83959: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44109 1727204232.83994: variable 'inventory_hostname' from source: host vars for 'managed-node1' 44109 1727204232.84004: variable 'ansible_host' from source: host vars for 'managed-node1' 44109 1727204232.84027: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44109 1727204232.84126: Set connection var ansible_connection to ssh 44109 1727204232.84142: Set connection var ansible_timeout to 10 44109 1727204232.84243: Set connection var ansible_module_compression to ZIP_DEFLATED 44109 1727204232.84246: Set connection var ansible_pipelining to False 44109 1727204232.84248: Set connection var ansible_shell_executable to /bin/sh 44109 1727204232.84250: Set connection var ansible_shell_type to sh 44109 1727204232.84252: variable 'ansible_shell_executable' from source: unknown 44109 1727204232.84254: variable 'ansible_connection' from source: unknown 44109 1727204232.84256: variable 'ansible_module_compression' from source: unknown 44109 1727204232.84257: variable 'ansible_shell_type' from source: unknown 44109 1727204232.84259: variable 'ansible_shell_executable' from source: unknown 44109 1727204232.84260: variable 'ansible_host' from source: host vars for 'managed-node1' 44109 1727204232.84262: variable 'ansible_pipelining' from source: unknown 44109 1727204232.84264: variable 'ansible_timeout' from source: unknown 44109 1727204232.84265: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed-node1' 44109 1727204232.84382: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 44109 1727204232.84396: variable 'omit' from source: magic vars 44109 1727204232.84405: starting attempt loop 44109 1727204232.84413: running the handler 44109 1727204232.84434: _low_level_execute_command(): starting 44109 1727204232.84444: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 44109 1727204232.85193: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44109 1727204232.85228: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44109 1727204232.85296: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44109 1727204232.85350: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 44109 1727204232.85369: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44109 1727204232.85397: stderr chunk (state=3): 
>>>debug2: mux_client_hello_exchange: master version 4 <<< 44109 1727204232.85521: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44109 1727204232.87314: stdout chunk (state=3): >>>/root <<< 44109 1727204232.87485: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44109 1727204232.87489: stdout chunk (state=3): >>><<< 44109 1727204232.87492: stderr chunk (state=3): >>><<< 44109 1727204232.87624: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44109 1727204232.87628: _low_level_execute_command(): starting 44109 1727204232.87630: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204232.8752136-45060-67255098171128 `" && echo 
ansible-tmp-1727204232.8752136-45060-67255098171128="` echo /root/.ansible/tmp/ansible-tmp-1727204232.8752136-45060-67255098171128 `" ) && sleep 0' 44109 1727204232.88203: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44109 1727204232.88223: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44109 1727204232.88238: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44109 1727204232.88266: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44109 1727204232.88288: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 <<< 44109 1727204232.88301: stderr chunk (state=3): >>>debug2: match not found <<< 44109 1727204232.88320: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44109 1727204232.88393: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44109 1727204232.88424: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 44109 1727204232.88442: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44109 1727204232.88466: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44109 1727204232.88590: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44109 1727204232.90698: stdout chunk (state=3): 
>>>ansible-tmp-1727204232.8752136-45060-67255098171128=/root/.ansible/tmp/ansible-tmp-1727204232.8752136-45060-67255098171128 <<< 44109 1727204232.90844: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44109 1727204232.90870: stdout chunk (state=3): >>><<< 44109 1727204232.90879: stderr chunk (state=3): >>><<< 44109 1727204232.90983: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204232.8752136-45060-67255098171128=/root/.ansible/tmp/ansible-tmp-1727204232.8752136-45060-67255098171128 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44109 1727204232.90987: variable 'ansible_module_compression' from source: unknown 44109 1727204232.90999: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-44109pzfqangk/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 44109 1727204232.91040: variable 'ansible_facts' 
from source: unknown 44109 1727204232.91144: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204232.8752136-45060-67255098171128/AnsiballZ_command.py 44109 1727204232.91326: Sending initial data 44109 1727204232.91334: Sent initial data (155 bytes) 44109 1727204232.91956: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44109 1727204232.91989: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44109 1727204232.92096: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK <<< 44109 1727204232.92123: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44109 1727204232.92236: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44109 1727204232.93997: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension 
"hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 44109 1727204232.94103: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 44109 1727204232.94196: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-44109pzfqangk/tmprvr7e1cm /root/.ansible/tmp/ansible-tmp-1727204232.8752136-45060-67255098171128/AnsiballZ_command.py <<< 44109 1727204232.94199: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204232.8752136-45060-67255098171128/AnsiballZ_command.py" <<< 44109 1727204232.94268: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-44109pzfqangk/tmprvr7e1cm" to remote "/root/.ansible/tmp/ansible-tmp-1727204232.8752136-45060-67255098171128/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204232.8752136-45060-67255098171128/AnsiballZ_command.py" <<< 44109 1727204232.95227: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44109 1727204232.95346: stderr chunk (state=3): >>><<< 44109 1727204232.95350: stdout chunk (state=3): >>><<< 44109 1727204232.95353: done transferring module to remote 44109 1727204232.95355: _low_level_execute_command(): starting 44109 1727204232.95357: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204232.8752136-45060-67255098171128/ /root/.ansible/tmp/ansible-tmp-1727204232.8752136-45060-67255098171128/AnsiballZ_command.py && 
sleep 0' 44109 1727204232.95962: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44109 1727204232.96061: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 44109 1727204232.96107: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44109 1727204232.96110: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44109 1727204232.96212: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44109 1727204232.98214: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44109 1727204232.98399: stderr chunk (state=3): >>><<< 44109 1727204232.98402: stdout chunk (state=3): >>><<< 44109 1727204232.98405: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 
debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44109 1727204232.98407: _low_level_execute_command(): starting 44109 1727204232.98410: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204232.8752136-45060-67255098171128/AnsiballZ_command.py && sleep 0' 44109 1727204232.98939: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44109 1727204232.98953: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44109 1727204232.98966: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44109 1727204232.98991: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44109 1727204232.99008: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 <<< 44109 1727204232.99020: stderr chunk (state=3): >>>debug2: match not found <<< 44109 1727204232.99034: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44109 1727204232.99052: stderr chunk (state=3): >>>debug1: configuration 
requests final Match pass <<< 44109 1727204232.99065: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.14.47 is address <<< 44109 1727204232.99080: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 44109 1727204232.99162: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK <<< 44109 1727204232.99189: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44109 1727204232.99315: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44109 1727204233.17858: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["nmcli", "d", "set", "ethtest0", "managed", "true"], "start": "2024-09-24 14:57:13.155931", "end": "2024-09-24 14:57:13.174606", "delta": "0:00:00.018675", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli d set ethtest0 managed true", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 44109 1727204233.19665: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.47 closed. 
<<< 44109 1727204233.19684: stdout chunk (state=3): >>><<< 44109 1727204233.19705: stderr chunk (state=3): >>><<< 44109 1727204233.19723: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["nmcli", "d", "set", "ethtest0", "managed", "true"], "start": "2024-09-24 14:57:13.155931", "end": "2024-09-24 14:57:13.174606", "delta": "0:00:00.018675", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli d set ethtest0 managed true", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.47 closed. 
44109 1727204233.19831: done with _execute_module (ansible.legacy.command, {'_raw_params': 'nmcli d set ethtest0 managed true', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204232.8752136-45060-67255098171128/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 44109 1727204233.19834: _low_level_execute_command(): starting 44109 1727204233.19836: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204232.8752136-45060-67255098171128/ > /dev/null 2>&1 && sleep 0' 44109 1727204233.20535: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 44109 1727204233.20623: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK <<< 44109 1727204233.20920: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44109 1727204233.20994: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44109 1727204233.23016: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44109 1727204233.23065: stderr chunk (state=3): >>><<< 44109 1727204233.23085: stdout chunk (state=3): >>><<< 44109 1727204233.23118: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44109 1727204233.23130: handler run complete 44109 1727204233.23159: Evaluated conditional (False): False 44109 
1727204233.23174: attempt loop complete, returning result 44109 1727204233.23184: _execute() done 44109 1727204233.23190: dumping result to json 44109 1727204233.23198: done dumping result, returning 44109 1727204233.23219: done running TaskExecutor() for managed-node1/TASK: Set up veth as managed by NetworkManager [028d2410-947f-ed67-a560-0000000001d1] 44109 1727204233.23227: sending task result for task 028d2410-947f-ed67-a560-0000000001d1 44109 1727204233.23359: done sending task result for task 028d2410-947f-ed67-a560-0000000001d1 44109 1727204233.23363: WORKER PROCESS EXITING ok: [managed-node1] => { "changed": false, "cmd": [ "nmcli", "d", "set", "ethtest0", "managed", "true" ], "delta": "0:00:00.018675", "end": "2024-09-24 14:57:13.174606", "rc": 0, "start": "2024-09-24 14:57:13.155931" } 44109 1727204233.23454: no more pending results, returning what we have 44109 1727204233.23457: results queue empty 44109 1727204233.23458: checking for any_errors_fatal 44109 1727204233.23472: done checking for any_errors_fatal 44109 1727204233.23472: checking for max_fail_percentage 44109 1727204233.23474: done checking for max_fail_percentage 44109 1727204233.23477: checking to see if all hosts have failed and the running result is not ok 44109 1727204233.23478: done checking to see if all hosts have failed 44109 1727204233.23479: getting the remaining hosts for this loop 44109 1727204233.23480: done getting the remaining hosts for this loop 44109 1727204233.23484: getting the next task for host managed-node1 44109 1727204233.23490: done getting next task for host managed-node1 44109 1727204233.23493: ^ task is: TASK: Delete veth interface {{ interface }} 44109 1727204233.23496: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44109 1727204233.23501: getting variables 44109 1727204233.23503: in VariableManager get_vars() 44109 1727204233.23542: Calling all_inventory to load vars for managed-node1 44109 1727204233.23545: Calling groups_inventory to load vars for managed-node1 44109 1727204233.23548: Calling all_plugins_inventory to load vars for managed-node1 44109 1727204233.23558: Calling all_plugins_play to load vars for managed-node1 44109 1727204233.23561: Calling groups_plugins_inventory to load vars for managed-node1 44109 1727204233.23563: Calling groups_plugins_play to load vars for managed-node1 44109 1727204233.24034: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44109 1727204233.24279: done with get_vars() 44109 1727204233.24290: done getting variables 44109 1727204233.24353: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 44109 1727204233.24473: variable 'interface' from source: set_fact TASK [Delete veth interface ethtest0] ****************************************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:43 Tuesday 24 September 2024 14:57:13 -0400 (0:00:00.423) 0:00:10.041 ***** 44109 1727204233.24504: entering _queue_task() for managed-node1/command 44109 1727204233.24881: worker is 1 
(out of 1 available) 44109 1727204233.24891: exiting _queue_task() for managed-node1/command 44109 1727204233.24899: done queuing things up, now waiting for results queue to drain 44109 1727204233.24900: waiting for pending results... 44109 1727204233.25199: running TaskExecutor() for managed-node1/TASK: Delete veth interface ethtest0 44109 1727204233.25204: in run() - task 028d2410-947f-ed67-a560-0000000001d2 44109 1727204233.25207: variable 'ansible_search_path' from source: unknown 44109 1727204233.25213: variable 'ansible_search_path' from source: unknown 44109 1727204233.25281: calling self._execute() 44109 1727204233.25334: variable 'ansible_host' from source: host vars for 'managed-node1' 44109 1727204233.25344: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44109 1727204233.25355: variable 'omit' from source: magic vars 44109 1727204233.25743: variable 'ansible_distribution_major_version' from source: facts 44109 1727204233.25751: Evaluated conditional (ansible_distribution_major_version != '6'): True 44109 1727204233.25952: variable 'type' from source: set_fact 44109 1727204233.25964: variable 'state' from source: include params 44109 1727204233.25972: variable 'interface' from source: set_fact 44109 1727204233.25985: variable 'current_interfaces' from source: set_fact 44109 1727204233.26002: Evaluated conditional (type == 'veth' and state == 'absent' and interface in current_interfaces): False 44109 1727204233.26010: when evaluation is False, skipping this task 44109 1727204233.26017: _execute() done 44109 1727204233.26026: dumping result to json 44109 1727204233.26032: done dumping result, returning 44109 1727204233.26042: done running TaskExecutor() for managed-node1/TASK: Delete veth interface ethtest0 [028d2410-947f-ed67-a560-0000000001d2] 44109 1727204233.26053: sending task result for task 028d2410-947f-ed67-a560-0000000001d2 skipping: [managed-node1] => { "changed": false, "false_condition": "type == 'veth' and state 
== 'absent' and interface in current_interfaces", "skip_reason": "Conditional result was False" } 44109 1727204233.26214: no more pending results, returning what we have 44109 1727204233.26218: results queue empty 44109 1727204233.26219: checking for any_errors_fatal 44109 1727204233.26226: done checking for any_errors_fatal 44109 1727204233.26227: checking for max_fail_percentage 44109 1727204233.26229: done checking for max_fail_percentage 44109 1727204233.26230: checking to see if all hosts have failed and the running result is not ok 44109 1727204233.26230: done checking to see if all hosts have failed 44109 1727204233.26231: getting the remaining hosts for this loop 44109 1727204233.26232: done getting the remaining hosts for this loop 44109 1727204233.26235: getting the next task for host managed-node1 44109 1727204233.26242: done getting next task for host managed-node1 44109 1727204233.26244: ^ task is: TASK: Create dummy interface {{ interface }} 44109 1727204233.26248: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44109 1727204233.26252: getting variables 44109 1727204233.26253: in VariableManager get_vars() 44109 1727204233.26291: Calling all_inventory to load vars for managed-node1 44109 1727204233.26293: Calling groups_inventory to load vars for managed-node1 44109 1727204233.26296: Calling all_plugins_inventory to load vars for managed-node1 44109 1727204233.26310: Calling all_plugins_play to load vars for managed-node1 44109 1727204233.26317: Calling groups_plugins_inventory to load vars for managed-node1 44109 1727204233.26321: Calling groups_plugins_play to load vars for managed-node1 44109 1727204233.26588: done sending task result for task 028d2410-947f-ed67-a560-0000000001d2 44109 1727204233.26592: WORKER PROCESS EXITING 44109 1727204233.26602: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44109 1727204233.26813: done with get_vars() 44109 1727204233.26836: done getting variables 44109 1727204233.26895: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 44109 1727204233.27021: variable 'interface' from source: set_fact TASK [Create dummy interface ethtest0] ***************************************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:49 Tuesday 24 September 2024 14:57:13 -0400 (0:00:00.025) 0:00:10.066 ***** 44109 1727204233.27054: entering _queue_task() for managed-node1/command 44109 1727204233.27389: worker is 1 (out of 1 available) 44109 1727204233.27400: exiting _queue_task() for managed-node1/command 44109 1727204233.27469: done queuing things up, now waiting for results queue to drain 44109 1727204233.27470: waiting for pending results... 
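
The skipped tasks above trace a repeating pattern in manage_test_interface.yml: each interface type (veth, dummy, tap) has a create task guarded on `state == 'present'` and a delete task guarded on `state == 'absent'`. A hedged reconstruction of the "Delete veth interface" task at line 43 of that file — only the name template and the `when` conditions are visible in this log; the module and the exact `ip` invocation are assumptions:

```yaml
# Reconstructed sketch -- the command itself is an assumption;
# the name template and the when-conditions are taken from the log.
- name: Delete veth interface {{ interface }}
  command: ip link del {{ interface }} type veth
  when:
    - type == 'veth'
    - state == 'absent'
    - interface in current_interfaces
```

Each clause is evaluated against variables whose provenance the log reports (`state` from include params, the others from `set_fact`); when the combined expression is false, the executor records "Conditional result was False" and skips the task without contacting the host.
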
44109 1727204233.27695: running TaskExecutor() for managed-node1/TASK: Create dummy interface ethtest0 44109 1727204233.27867: in run() - task 028d2410-947f-ed67-a560-0000000001d3 44109 1727204233.27872: variable 'ansible_search_path' from source: unknown 44109 1727204233.27877: variable 'ansible_search_path' from source: unknown 44109 1727204233.27959: calling self._execute() 44109 1727204233.28016: variable 'ansible_host' from source: host vars for 'managed-node1' 44109 1727204233.28027: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44109 1727204233.28039: variable 'omit' from source: magic vars 44109 1727204233.28455: variable 'ansible_distribution_major_version' from source: facts 44109 1727204233.28501: Evaluated conditional (ansible_distribution_major_version != '6'): True 44109 1727204233.28993: variable 'type' from source: set_fact 44109 1727204233.28997: variable 'state' from source: include params 44109 1727204233.28999: variable 'interface' from source: set_fact 44109 1727204233.29000: variable 'current_interfaces' from source: set_fact 44109 1727204233.29003: Evaluated conditional (type == 'dummy' and state == 'present' and interface not in current_interfaces): False 44109 1727204233.29005: when evaluation is False, skipping this task 44109 1727204233.29006: _execute() done 44109 1727204233.29008: dumping result to json 44109 1727204233.29009: done dumping result, returning 44109 1727204233.29013: done running TaskExecutor() for managed-node1/TASK: Create dummy interface ethtest0 [028d2410-947f-ed67-a560-0000000001d3] 44109 1727204233.29015: sending task result for task 028d2410-947f-ed67-a560-0000000001d3 44109 1727204233.29071: done sending task result for task 028d2410-947f-ed67-a560-0000000001d3 44109 1727204233.29074: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "type == 'dummy' and state == 'present' and interface not in current_interfaces", "skip_reason": "Conditional 
result was False" } 44109 1727204233.29140: no more pending results, returning what we have 44109 1727204233.29144: results queue empty 44109 1727204233.29145: checking for any_errors_fatal 44109 1727204233.29151: done checking for any_errors_fatal 44109 1727204233.29152: checking for max_fail_percentage 44109 1727204233.29153: done checking for max_fail_percentage 44109 1727204233.29155: checking to see if all hosts have failed and the running result is not ok 44109 1727204233.29155: done checking to see if all hosts have failed 44109 1727204233.29156: getting the remaining hosts for this loop 44109 1727204233.29157: done getting the remaining hosts for this loop 44109 1727204233.29160: getting the next task for host managed-node1 44109 1727204233.29166: done getting next task for host managed-node1 44109 1727204233.29169: ^ task is: TASK: Delete dummy interface {{ interface }} 44109 1727204233.29172: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44109 1727204233.29182: getting variables 44109 1727204233.29183: in VariableManager get_vars() 44109 1727204233.29220: Calling all_inventory to load vars for managed-node1 44109 1727204233.29222: Calling groups_inventory to load vars for managed-node1 44109 1727204233.29224: Calling all_plugins_inventory to load vars for managed-node1 44109 1727204233.29235: Calling all_plugins_play to load vars for managed-node1 44109 1727204233.29238: Calling groups_plugins_inventory to load vars for managed-node1 44109 1727204233.29240: Calling groups_plugins_play to load vars for managed-node1 44109 1727204233.30390: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44109 1727204233.30590: done with get_vars() 44109 1727204233.30599: done getting variables 44109 1727204233.30658: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 44109 1727204233.30768: variable 'interface' from source: set_fact TASK [Delete dummy interface ethtest0] ***************************************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:54 Tuesday 24 September 2024 14:57:13 -0400 (0:00:00.037) 0:00:10.104 ***** 44109 1727204233.30796: entering _queue_task() for managed-node1/command 44109 1727204233.31172: worker is 1 (out of 1 available) 44109 1727204233.31182: exiting _queue_task() for managed-node1/command 44109 1727204233.31192: done queuing things up, now waiting for results queue to drain 44109 1727204233.31193: waiting for pending results... 
44109 1727204233.31430: running TaskExecutor() for managed-node1/TASK: Delete dummy interface ethtest0 44109 1727204233.31583: in run() - task 028d2410-947f-ed67-a560-0000000001d4 44109 1727204233.31588: variable 'ansible_search_path' from source: unknown 44109 1727204233.31590: variable 'ansible_search_path' from source: unknown 44109 1727204233.31593: calling self._execute() 44109 1727204233.31632: variable 'ansible_host' from source: host vars for 'managed-node1' 44109 1727204233.31643: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44109 1727204233.31655: variable 'omit' from source: magic vars 44109 1727204233.32036: variable 'ansible_distribution_major_version' from source: facts 44109 1727204233.32052: Evaluated conditional (ansible_distribution_major_version != '6'): True 44109 1727204233.32274: variable 'type' from source: set_fact 44109 1727204233.32287: variable 'state' from source: include params 44109 1727204233.32295: variable 'interface' from source: set_fact 44109 1727204233.32303: variable 'current_interfaces' from source: set_fact 44109 1727204233.32318: Evaluated conditional (type == 'dummy' and state == 'absent' and interface in current_interfaces): False 44109 1727204233.32344: when evaluation is False, skipping this task 44109 1727204233.32348: _execute() done 44109 1727204233.32350: dumping result to json 44109 1727204233.32355: done dumping result, returning 44109 1727204233.32359: done running TaskExecutor() for managed-node1/TASK: Delete dummy interface ethtest0 [028d2410-947f-ed67-a560-0000000001d4] 44109 1727204233.32382: sending task result for task 028d2410-947f-ed67-a560-0000000001d4 44109 1727204233.32582: done sending task result for task 028d2410-947f-ed67-a560-0000000001d4 44109 1727204233.32586: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "type == 'dummy' and state == 'absent' and interface in current_interfaces", "skip_reason": "Conditional result was 
False" } 44109 1727204233.32638: no more pending results, returning what we have 44109 1727204233.32642: results queue empty 44109 1727204233.32643: checking for any_errors_fatal 44109 1727204233.32650: done checking for any_errors_fatal 44109 1727204233.32651: checking for max_fail_percentage 44109 1727204233.32652: done checking for max_fail_percentage 44109 1727204233.32654: checking to see if all hosts have failed and the running result is not ok 44109 1727204233.32655: done checking to see if all hosts have failed 44109 1727204233.32655: getting the remaining hosts for this loop 44109 1727204233.32657: done getting the remaining hosts for this loop 44109 1727204233.32660: getting the next task for host managed-node1 44109 1727204233.32672: done getting next task for host managed-node1 44109 1727204233.32676: ^ task is: TASK: Create tap interface {{ interface }} 44109 1727204233.32680: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44109 1727204233.32684: getting variables 44109 1727204233.32686: in VariableManager get_vars() 44109 1727204233.32728: Calling all_inventory to load vars for managed-node1 44109 1727204233.32731: Calling groups_inventory to load vars for managed-node1 44109 1727204233.32733: Calling all_plugins_inventory to load vars for managed-node1 44109 1727204233.32747: Calling all_plugins_play to load vars for managed-node1 44109 1727204233.32749: Calling groups_plugins_inventory to load vars for managed-node1 44109 1727204233.32752: Calling groups_plugins_play to load vars for managed-node1 44109 1727204233.33056: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44109 1727204233.33272: done with get_vars() 44109 1727204233.33285: done getting variables 44109 1727204233.33353: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 44109 1727204233.33498: variable 'interface' from source: set_fact TASK [Create tap interface ethtest0] ******************************************* task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:60 Tuesday 24 September 2024 14:57:13 -0400 (0:00:00.027) 0:00:10.131 ***** 44109 1727204233.33531: entering _queue_task() for managed-node1/command 44109 1727204233.33916: worker is 1 (out of 1 available) 44109 1727204233.33927: exiting _queue_task() for managed-node1/command 44109 1727204233.33937: done queuing things up, now waiting for results queue to drain 44109 1727204233.33938: waiting for pending results... 
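
Note that every task in this file is first checked against `ansible_distribution_major_version != '6'` before its own type/state conditional, which suggests the included tasks inherit a shared guard rather than each repeating it. A speculative sketch of that structure — the block layout and the `ip tuntap` command are assumptions, not taken from this log:

```yaml
# Speculative layout: the distribution check repeated on every task is
# consistent with a when: inherited from an enclosing block.
- name: Manage the test interface
  when: ansible_distribution_major_version != '6'
  block:
    - name: Create tap interface {{ interface }}
      command: ip tuntap add dev {{ interface }} mode tap
      when: type == 'tap' and state == 'present' and interface not in current_interfaces
```
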
44109 1727204233.34180: running TaskExecutor() for managed-node1/TASK: Create tap interface ethtest0 44109 1727204233.34236: in run() - task 028d2410-947f-ed67-a560-0000000001d5 44109 1727204233.34258: variable 'ansible_search_path' from source: unknown 44109 1727204233.34272: variable 'ansible_search_path' from source: unknown 44109 1727204233.34321: calling self._execute() 44109 1727204233.34417: variable 'ansible_host' from source: host vars for 'managed-node1' 44109 1727204233.34480: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44109 1727204233.34489: variable 'omit' from source: magic vars 44109 1727204233.34827: variable 'ansible_distribution_major_version' from source: facts 44109 1727204233.34844: Evaluated conditional (ansible_distribution_major_version != '6'): True 44109 1727204233.35059: variable 'type' from source: set_fact 44109 1727204233.35070: variable 'state' from source: include params 44109 1727204233.35081: variable 'interface' from source: set_fact 44109 1727204233.35091: variable 'current_interfaces' from source: set_fact 44109 1727204233.35102: Evaluated conditional (type == 'tap' and state == 'present' and interface not in current_interfaces): False 44109 1727204233.35114: when evaluation is False, skipping this task 44109 1727204233.35147: _execute() done 44109 1727204233.35150: dumping result to json 44109 1727204233.35153: done dumping result, returning 44109 1727204233.35156: done running TaskExecutor() for managed-node1/TASK: Create tap interface ethtest0 [028d2410-947f-ed67-a560-0000000001d5] 44109 1727204233.35158: sending task result for task 028d2410-947f-ed67-a560-0000000001d5 skipping: [managed-node1] => { "changed": false, "false_condition": "type == 'tap' and state == 'present' and interface not in current_interfaces", "skip_reason": "Conditional result was False" } 44109 1727204233.35300: no more pending results, returning what we have 44109 1727204233.35305: results queue empty 44109 
1727204233.35306: checking for any_errors_fatal 44109 1727204233.35314: done checking for any_errors_fatal 44109 1727204233.35315: checking for max_fail_percentage 44109 1727204233.35317: done checking for max_fail_percentage 44109 1727204233.35318: checking to see if all hosts have failed and the running result is not ok 44109 1727204233.35319: done checking to see if all hosts have failed 44109 1727204233.35320: getting the remaining hosts for this loop 44109 1727204233.35321: done getting the remaining hosts for this loop 44109 1727204233.35325: getting the next task for host managed-node1 44109 1727204233.35332: done getting next task for host managed-node1 44109 1727204233.35335: ^ task is: TASK: Delete tap interface {{ interface }} 44109 1727204233.35338: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44109 1727204233.35343: getting variables 44109 1727204233.35345: in VariableManager get_vars() 44109 1727204233.35389: Calling all_inventory to load vars for managed-node1 44109 1727204233.35392: Calling groups_inventory to load vars for managed-node1 44109 1727204233.35395: Calling all_plugins_inventory to load vars for managed-node1 44109 1727204233.35407: Calling all_plugins_play to load vars for managed-node1 44109 1727204233.35410: Calling groups_plugins_inventory to load vars for managed-node1 44109 1727204233.35417: Calling groups_plugins_play to load vars for managed-node1 44109 1727204233.35896: done sending task result for task 028d2410-947f-ed67-a560-0000000001d5 44109 1727204233.35900: WORKER PROCESS EXITING 44109 1727204233.35929: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44109 1727204233.36129: done with get_vars() 44109 1727204233.36138: done getting variables 44109 1727204233.36197: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 44109 1727204233.36310: variable 'interface' from source: set_fact TASK [Delete tap interface ethtest0] ******************************************* task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:65 Tuesday 24 September 2024 14:57:13 -0400 (0:00:00.028) 0:00:10.159 ***** 44109 1727204233.36347: entering _queue_task() for managed-node1/command 44109 1727204233.36616: worker is 1 (out of 1 available) 44109 1727204233.36627: exiting _queue_task() for managed-node1/command 44109 1727204233.36637: done queuing things up, now waiting for results queue to drain 44109 1727204233.36638: waiting for pending results... 
44109 1727204233.36875: running TaskExecutor() for managed-node1/TASK: Delete tap interface ethtest0 44109 1727204233.36972: in run() - task 028d2410-947f-ed67-a560-0000000001d6 44109 1727204233.37004: variable 'ansible_search_path' from source: unknown 44109 1727204233.37013: variable 'ansible_search_path' from source: unknown 44109 1727204233.37051: calling self._execute() 44109 1727204233.37137: variable 'ansible_host' from source: host vars for 'managed-node1' 44109 1727204233.37146: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44109 1727204233.37215: variable 'omit' from source: magic vars 44109 1727204233.37540: variable 'ansible_distribution_major_version' from source: facts 44109 1727204233.37560: Evaluated conditional (ansible_distribution_major_version != '6'): True 44109 1727204233.37784: variable 'type' from source: set_fact 44109 1727204233.37799: variable 'state' from source: include params 44109 1727204233.37809: variable 'interface' from source: set_fact 44109 1727204233.37822: variable 'current_interfaces' from source: set_fact 44109 1727204233.37834: Evaluated conditional (type == 'tap' and state == 'absent' and interface in current_interfaces): False 44109 1727204233.37867: when evaluation is False, skipping this task 44109 1727204233.37870: _execute() done 44109 1727204233.37873: dumping result to json 44109 1727204233.37876: done dumping result, returning 44109 1727204233.37882: done running TaskExecutor() for managed-node1/TASK: Delete tap interface ethtest0 [028d2410-947f-ed67-a560-0000000001d6] 44109 1727204233.37891: sending task result for task 028d2410-947f-ed67-a560-0000000001d6 44109 1727204233.38039: done sending task result for task 028d2410-947f-ed67-a560-0000000001d6 44109 1727204233.38042: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "type == 'tap' and state == 'absent' and interface in current_interfaces", "skip_reason": "Conditional result was False" } 
44109 1727204233.38126: no more pending results, returning what we have 44109 1727204233.38130: results queue empty 44109 1727204233.38131: checking for any_errors_fatal 44109 1727204233.38136: done checking for any_errors_fatal 44109 1727204233.38136: checking for max_fail_percentage 44109 1727204233.38138: done checking for max_fail_percentage 44109 1727204233.38140: checking to see if all hosts have failed and the running result is not ok 44109 1727204233.38141: done checking to see if all hosts have failed 44109 1727204233.38141: getting the remaining hosts for this loop 44109 1727204233.38142: done getting the remaining hosts for this loop 44109 1727204233.38146: getting the next task for host managed-node1 44109 1727204233.38155: done getting next task for host managed-node1 44109 1727204233.38158: ^ task is: TASK: Include the task 'assert_device_present.yml' 44109 1727204233.38161: ^ state is: HOST STATE: block=2, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44109 1727204233.38166: getting variables 44109 1727204233.38168: in VariableManager get_vars() 44109 1727204233.38206: Calling all_inventory to load vars for managed-node1 44109 1727204233.38209: Calling groups_inventory to load vars for managed-node1 44109 1727204233.38215: Calling all_plugins_inventory to load vars for managed-node1 44109 1727204233.38227: Calling all_plugins_play to load vars for managed-node1 44109 1727204233.38231: Calling groups_plugins_inventory to load vars for managed-node1 44109 1727204233.38234: Calling groups_plugins_play to load vars for managed-node1 44109 1727204233.38558: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44109 1727204233.38764: done with get_vars() 44109 1727204233.38773: done getting variables TASK [Include the task 'assert_device_present.yml'] **************************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_routing_rules.yml:20 Tuesday 24 September 2024 14:57:13 -0400 (0:00:00.025) 0:00:10.184 ***** 44109 1727204233.38868: entering _queue_task() for managed-node1/include_tasks 44109 1727204233.39113: worker is 1 (out of 1 available) 44109 1727204233.39125: exiting _queue_task() for managed-node1/include_tasks 44109 1727204233.39249: done queuing things up, now waiting for results queue to drain 44109 1727204233.39250: waiting for pending results... 
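
At this point the play leaves manage_test_interface.yml and enters an include chain: tests_routing_rules.yml:20 includes assert_device_present.yml, which (as the next task shows) in turn includes get_interface_stat.yml. A minimal sketch of what assert_device_present.yml plausibly contains — the include is confirmed by the log, but the assert body is an assumption based on the file names:

```yaml
# assert_device_present.yml -- reconstructed sketch, not verbatim.
- name: Include the task 'get_interface_stat.yml'
  include_tasks: get_interface_stat.yml

- name: Assert that the interface device is present
  assert:
    that:
      - interface_stat.stat.exists
```
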
44109 1727204233.39479: running TaskExecutor() for managed-node1/TASK: Include the task 'assert_device_present.yml' 44109 1727204233.39506: in run() - task 028d2410-947f-ed67-a560-00000000000e 44109 1727204233.39529: variable 'ansible_search_path' from source: unknown 44109 1727204233.39592: calling self._execute() 44109 1727204233.39662: variable 'ansible_host' from source: host vars for 'managed-node1' 44109 1727204233.39673: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44109 1727204233.39692: variable 'omit' from source: magic vars 44109 1727204233.40117: variable 'ansible_distribution_major_version' from source: facts 44109 1727204233.40120: Evaluated conditional (ansible_distribution_major_version != '6'): True 44109 1727204233.40123: _execute() done 44109 1727204233.40126: dumping result to json 44109 1727204233.40129: done dumping result, returning 44109 1727204233.40131: done running TaskExecutor() for managed-node1/TASK: Include the task 'assert_device_present.yml' [028d2410-947f-ed67-a560-00000000000e] 44109 1727204233.40133: sending task result for task 028d2410-947f-ed67-a560-00000000000e 44109 1727204233.40338: done sending task result for task 028d2410-947f-ed67-a560-00000000000e 44109 1727204233.40342: WORKER PROCESS EXITING 44109 1727204233.40368: no more pending results, returning what we have 44109 1727204233.40372: in VariableManager get_vars() 44109 1727204233.40417: Calling all_inventory to load vars for managed-node1 44109 1727204233.40420: Calling groups_inventory to load vars for managed-node1 44109 1727204233.40423: Calling all_plugins_inventory to load vars for managed-node1 44109 1727204233.40541: Calling all_plugins_play to load vars for managed-node1 44109 1727204233.40550: Calling groups_plugins_inventory to load vars for managed-node1 44109 1727204233.40554: Calling groups_plugins_play to load vars for managed-node1 44109 1727204233.40787: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44109 1727204233.40989: done with get_vars() 44109 1727204233.40996: variable 'ansible_search_path' from source: unknown 44109 1727204233.41008: we have included files to process 44109 1727204233.41009: generating all_blocks data 44109 1727204233.41010: done generating all_blocks data 44109 1727204233.41017: processing included file: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml 44109 1727204233.41018: loading included file: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml 44109 1727204233.41020: Loading data from /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml 44109 1727204233.41174: in VariableManager get_vars() 44109 1727204233.41198: done with get_vars() 44109 1727204233.41308: done processing included file 44109 1727204233.41310: iterating over new_blocks loaded from include file 44109 1727204233.41314: in VariableManager get_vars() 44109 1727204233.41328: done with get_vars() 44109 1727204233.41329: filtering new block on tags 44109 1727204233.41346: done filtering new block on tags 44109 1727204233.41348: done iterating over new_blocks loaded from include file included: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml for managed-node1 44109 1727204233.41353: extending task lists for all hosts with included blocks 44109 1727204233.43096: done extending task lists 44109 1727204233.43097: done processing included files 44109 1727204233.43098: results queue empty 44109 1727204233.43099: checking for any_errors_fatal 44109 1727204233.43101: done checking for any_errors_fatal 44109 1727204233.43102: checking for max_fail_percentage 44109 1727204233.43104: done 
checking for max_fail_percentage 44109 1727204233.43105: checking to see if all hosts have failed and the running result is not ok 44109 1727204233.43106: done checking to see if all hosts have failed 44109 1727204233.43106: getting the remaining hosts for this loop 44109 1727204233.43108: done getting the remaining hosts for this loop 44109 1727204233.43110: getting the next task for host managed-node1 44109 1727204233.43116: done getting next task for host managed-node1 44109 1727204233.43118: ^ task is: TASK: Include the task 'get_interface_stat.yml' 44109 1727204233.43121: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44109 1727204233.43123: getting variables 44109 1727204233.43124: in VariableManager get_vars() 44109 1727204233.43135: Calling all_inventory to load vars for managed-node1 44109 1727204233.43137: Calling groups_inventory to load vars for managed-node1 44109 1727204233.43139: Calling all_plugins_inventory to load vars for managed-node1 44109 1727204233.43149: Calling all_plugins_play to load vars for managed-node1 44109 1727204233.43151: Calling groups_plugins_inventory to load vars for managed-node1 44109 1727204233.43155: Calling groups_plugins_play to load vars for managed-node1 44109 1727204233.43302: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44109 1727204233.43503: done with get_vars() 44109 1727204233.43515: done getting variables TASK [Include the task 'get_interface_stat.yml'] ******************************* task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:3 Tuesday 24 September 2024 14:57:13 -0400 (0:00:00.047) 0:00:10.232 ***** 44109 1727204233.43588: entering _queue_task() for managed-node1/include_tasks 44109 1727204233.43861: worker is 1 (out of 1 available) 44109 1727204233.43873: exiting _queue_task() for managed-node1/include_tasks 44109 1727204233.43987: done queuing things up, now waiting for results queue to drain 44109 1727204233.43989: waiting for pending results... 
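
The task just queued is the nested include of get_interface_stat.yml; once its result comes back, the controller loads the file, generates its blocks, filters them on tags, and splices them into the host's task list ("extending task lists for all hosts with included blocks"). A plausible one-task sketch of that file — the /sys/class/net path and the registered variable name are guesses consistent with the surrounding task names:

```yaml
# get_interface_stat.yml -- reconstructed sketch; the path and the
# register name are assumptions, not taken from this log.
- name: Get stat for the interface {{ interface }}
  stat:
    path: /sys/class/net/{{ interface }}
  register: interface_stat
```
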
44109 1727204233.44165: running TaskExecutor() for managed-node1/TASK: Include the task 'get_interface_stat.yml' 44109 1727204233.44285: in run() - task 028d2410-947f-ed67-a560-0000000002ec 44109 1727204233.44305: variable 'ansible_search_path' from source: unknown 44109 1727204233.44315: variable 'ansible_search_path' from source: unknown 44109 1727204233.44382: calling self._execute() 44109 1727204233.44456: variable 'ansible_host' from source: host vars for 'managed-node1' 44109 1727204233.44559: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44109 1727204233.44562: variable 'omit' from source: magic vars 44109 1727204233.44860: variable 'ansible_distribution_major_version' from source: facts 44109 1727204233.44878: Evaluated conditional (ansible_distribution_major_version != '6'): True 44109 1727204233.44897: _execute() done 44109 1727204233.44906: dumping result to json 44109 1727204233.44996: done dumping result, returning 44109 1727204233.44999: done running TaskExecutor() for managed-node1/TASK: Include the task 'get_interface_stat.yml' [028d2410-947f-ed67-a560-0000000002ec] 44109 1727204233.45002: sending task result for task 028d2410-947f-ed67-a560-0000000002ec 44109 1727204233.45068: done sending task result for task 028d2410-947f-ed67-a560-0000000002ec 44109 1727204233.45071: WORKER PROCESS EXITING 44109 1727204233.45125: no more pending results, returning what we have 44109 1727204233.45131: in VariableManager get_vars() 44109 1727204233.45171: Calling all_inventory to load vars for managed-node1 44109 1727204233.45173: Calling groups_inventory to load vars for managed-node1 44109 1727204233.45178: Calling all_plugins_inventory to load vars for managed-node1 44109 1727204233.45190: Calling all_plugins_play to load vars for managed-node1 44109 1727204233.45192: Calling groups_plugins_inventory to load vars for managed-node1 44109 1727204233.45195: Calling groups_plugins_play to load vars for managed-node1 44109 
1727204233.45596: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44109 1727204233.45790: done with get_vars() 44109 1727204233.45797: variable 'ansible_search_path' from source: unknown 44109 1727204233.45798: variable 'ansible_search_path' from source: unknown 44109 1727204233.45884: we have included files to process 44109 1727204233.45886: generating all_blocks data 44109 1727204233.45887: done generating all_blocks data 44109 1727204233.45889: processing included file: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 44109 1727204233.45890: loading included file: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 44109 1727204233.45892: Loading data from /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 44109 1727204233.46328: done processing included file 44109 1727204233.46330: iterating over new_blocks loaded from include file 44109 1727204233.46331: in VariableManager get_vars() 44109 1727204233.46346: done with get_vars() 44109 1727204233.46348: filtering new block on tags 44109 1727204233.46434: done filtering new block on tags 44109 1727204233.46437: done iterating over new_blocks loaded from include file included: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml for managed-node1 44109 1727204233.46442: extending task lists for all hosts with included blocks 44109 1727204233.46754: done extending task lists 44109 1727204233.46756: done processing included files 44109 1727204233.46756: results queue empty 44109 1727204233.46757: checking for any_errors_fatal 44109 1727204233.46760: done checking for any_errors_fatal 44109 1727204233.46760: checking for max_fail_percentage 44109 1727204233.46762: done checking for 
max_fail_percentage 44109 1727204233.46762: checking to see if all hosts have failed and the running result is not ok 44109 1727204233.46763: done checking to see if all hosts have failed 44109 1727204233.46764: getting the remaining hosts for this loop 44109 1727204233.46765: done getting the remaining hosts for this loop 44109 1727204233.46767: getting the next task for host managed-node1 44109 1727204233.46771: done getting next task for host managed-node1 44109 1727204233.46773: ^ task is: TASK: Get stat for interface {{ interface }} 44109 1727204233.46779: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44109 1727204233.46782: getting variables 44109 1727204233.46782: in VariableManager get_vars() 44109 1727204233.46851: Calling all_inventory to load vars for managed-node1 44109 1727204233.46854: Calling groups_inventory to load vars for managed-node1 44109 1727204233.46857: Calling all_plugins_inventory to load vars for managed-node1 44109 1727204233.46861: Calling all_plugins_play to load vars for managed-node1 44109 1727204233.46864: Calling groups_plugins_inventory to load vars for managed-node1 44109 1727204233.46866: Calling groups_plugins_play to load vars for managed-node1 44109 1727204233.47094: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44109 1727204233.47496: done with get_vars() 44109 1727204233.47505: done getting variables 44109 1727204233.47747: variable 'interface' from source: set_fact TASK [Get stat for interface ethtest0] ***************************************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml:3 Tuesday 24 September 2024 14:57:13 -0400 (0:00:00.041) 0:00:10.274 ***** 44109 1727204233.47856: entering _queue_task() for managed-node1/stat 44109 1727204233.48208: worker is 1 (out of 1 available) 44109 1727204233.48221: exiting _queue_task() for managed-node1/stat 44109 1727204233.48242: done queuing things up, now waiting for results queue to drain 44109 1727204233.48244: waiting for pending results... 
44109 1727204233.48514: running TaskExecutor() for managed-node1/TASK: Get stat for interface ethtest0 44109 1727204233.48590: in run() - task 028d2410-947f-ed67-a560-0000000003b5 44109 1727204233.48619: variable 'ansible_search_path' from source: unknown 44109 1727204233.48683: variable 'ansible_search_path' from source: unknown 44109 1727204233.48687: calling self._execute() 44109 1727204233.48763: variable 'ansible_host' from source: host vars for 'managed-node1' 44109 1727204233.48777: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44109 1727204233.48793: variable 'omit' from source: magic vars 44109 1727204233.49191: variable 'ansible_distribution_major_version' from source: facts 44109 1727204233.49209: Evaluated conditional (ansible_distribution_major_version != '6'): True 44109 1727204233.49220: variable 'omit' from source: magic vars 44109 1727204233.49274: variable 'omit' from source: magic vars 44109 1727204233.49382: variable 'interface' from source: set_fact 44109 1727204233.49580: variable 'omit' from source: magic vars 44109 1727204233.49584: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 44109 1727204233.49587: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 44109 1727204233.49589: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 44109 1727204233.49591: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44109 1727204233.49593: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44109 1727204233.49595: variable 'inventory_hostname' from source: host vars for 'managed-node1' 44109 1727204233.49597: variable 'ansible_host' from source: host vars for 'managed-node1' 44109 1727204233.49600: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44109 1727204233.49694: Set connection var ansible_connection to ssh 44109 1727204233.49706: Set connection var ansible_timeout to 10 44109 1727204233.49728: Set connection var ansible_module_compression to ZIP_DEFLATED 44109 1727204233.49741: Set connection var ansible_pipelining to False 44109 1727204233.49751: Set connection var ansible_shell_executable to /bin/sh 44109 1727204233.49796: Set connection var ansible_shell_type to sh 44109 1727204233.49829: variable 'ansible_shell_executable' from source: unknown 44109 1727204233.49934: variable 'ansible_connection' from source: unknown 44109 1727204233.49938: variable 'ansible_module_compression' from source: unknown 44109 1727204233.49940: variable 'ansible_shell_type' from source: unknown 44109 1727204233.49942: variable 'ansible_shell_executable' from source: unknown 44109 1727204233.49946: variable 'ansible_host' from source: host vars for 'managed-node1' 44109 1727204233.49948: variable 'ansible_pipelining' from source: unknown 44109 1727204233.49950: variable 'ansible_timeout' from source: unknown 44109 1727204233.49952: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44109 1727204233.50360: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 44109 1727204233.50383: variable 'omit' from source: magic vars 44109 1727204233.50392: starting attempt loop 44109 1727204233.50398: running the handler 44109 1727204233.50416: _low_level_execute_command(): starting 44109 1727204233.50429: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 44109 1727204233.51201: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: 
Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44109 1727204233.51294: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 44109 1727204233.51347: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 44109 1727204233.51431: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44109 1727204233.53226: stdout chunk (state=3): >>>/root <<< 44109 1727204233.53340: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44109 1727204233.53344: stdout chunk (state=3): >>><<< 44109 1727204233.53347: stderr chunk (state=3): >>><<< 44109 1727204233.53379: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass 
debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44109 1727204233.53402: _low_level_execute_command(): starting 44109 1727204233.53484: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204233.5338714-45152-39890178190994 `" && echo ansible-tmp-1727204233.5338714-45152-39890178190994="` echo /root/.ansible/tmp/ansible-tmp-1727204233.5338714-45152-39890178190994 `" ) && sleep 0' 44109 1727204233.54149: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44109 1727204233.54163: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44109 1727204233.54191: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44109 1727204233.54288: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration 
data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK <<< 44109 1727204233.54318: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44109 1727204233.54427: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44109 1727204233.56538: stdout chunk (state=3): >>>ansible-tmp-1727204233.5338714-45152-39890178190994=/root/.ansible/tmp/ansible-tmp-1727204233.5338714-45152-39890178190994 <<< 44109 1727204233.56646: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44109 1727204233.56655: stdout chunk (state=3): >>><<< 44109 1727204233.56897: stderr chunk (state=3): >>><<< 44109 1727204233.56927: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204233.5338714-45152-39890178190994=/root/.ansible/tmp/ansible-tmp-1727204233.5338714-45152-39890178190994 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 
10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44109 1727204233.56983: variable 'ansible_module_compression' from source: unknown 44109 1727204233.57055: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-44109pzfqangk/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 44109 1727204233.57098: variable 'ansible_facts' from source: unknown 44109 1727204233.57209: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204233.5338714-45152-39890178190994/AnsiballZ_stat.py 44109 1727204233.57462: Sending initial data 44109 1727204233.57465: Sent initial data (152 bytes) 44109 1727204233.58091: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44109 1727204233.58128: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at 
'/root/.ansible/cp/a0f5415566' <<< 44109 1727204233.58150: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44109 1727204233.58164: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44109 1727204233.58273: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44109 1727204233.60019: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 44109 1727204233.60103: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 44109 1727204233.60183: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-44109pzfqangk/tmpni5cd3w7 /root/.ansible/tmp/ansible-tmp-1727204233.5338714-45152-39890178190994/AnsiballZ_stat.py <<< 44109 1727204233.60196: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204233.5338714-45152-39890178190994/AnsiballZ_stat.py" <<< 44109 1727204233.60265: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory <<< 44109 1727204233.60288: stderr chunk (state=3): >>>debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-44109pzfqangk/tmpni5cd3w7" to remote "/root/.ansible/tmp/ansible-tmp-1727204233.5338714-45152-39890178190994/AnsiballZ_stat.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204233.5338714-45152-39890178190994/AnsiballZ_stat.py" <<< 44109 1727204233.61520: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44109 1727204233.61523: stdout chunk (state=3): >>><<< 44109 1727204233.61526: stderr chunk (state=3): >>><<< 44109 1727204233.61665: done transferring module to remote 44109 1727204233.61668: _low_level_execute_command(): starting 44109 1727204233.61670: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204233.5338714-45152-39890178190994/ /root/.ansible/tmp/ansible-tmp-1727204233.5338714-45152-39890178190994/AnsiballZ_stat.py && sleep 0' 44109 1727204233.62704: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 44109 1727204233.62708: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 <<< 44109 1727204233.62710: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 44109 1727204233.62713: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44109 1727204233.62715: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44109 1727204233.62758: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 44109 1727204233.62786: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44109 1727204233.62995: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44109 1727204233.63089: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44109 1727204233.65092: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44109 1727204233.65214: stderr chunk (state=3): >>><<< 44109 1727204233.65217: stdout chunk (state=3): >>><<< 44109 1727204233.65225: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: 
Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44109 1727204233.65228: _low_level_execute_command(): starting 44109 1727204233.65231: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204233.5338714-45152-39890178190994/AnsiballZ_stat.py && sleep 0' 44109 1727204233.65759: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44109 1727204233.65773: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44109 1727204233.65791: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44109 1727204233.65809: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44109 1727204233.65829: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 <<< 44109 1727204233.65841: stderr chunk (state=3): >>>debug2: match not found <<< 44109 1727204233.65940: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 
10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 44109 1727204233.65958: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44109 1727204233.65973: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44109 1727204233.66097: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44109 1727204233.82609: stdout chunk (state=3): >>> <<< 44109 1727204233.82633: stdout chunk (state=3): >>>{"changed": false, "stat": {"exists": true, "path": "/sys/class/net/ethtest0", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 31020, "dev": 23, "nlink": 1, "atime": 1727204231.8174593, "mtime": 1727204231.8174593, "ctime": 1727204231.8174593, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/ethtest0", "lnk_target": "../../devices/virtual/net/ethtest0", "pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/ethtest0", "follow": false, "checksum_algorithm": "sha1"}}} <<< 44109 1727204233.84103: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.47 closed. 
<<< 44109 1727204233.84142: stderr chunk (state=3): >>><<< 44109 1727204233.84146: stdout chunk (state=3): >>><<< 44109 1727204233.84163: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": true, "path": "/sys/class/net/ethtest0", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 31020, "dev": 23, "nlink": 1, "atime": 1727204231.8174593, "mtime": 1727204231.8174593, "ctime": 1727204231.8174593, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/ethtest0", "lnk_target": "../../devices/virtual/net/ethtest0", "pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/ethtest0", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.47 closed. 44109 1727204233.84206: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/sys/class/net/ethtest0', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204233.5338714-45152-39890178190994/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 44109 1727204233.84217: _low_level_execute_command(): starting 44109 1727204233.84221: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204233.5338714-45152-39890178190994/ > /dev/null 2>&1 && sleep 0' 44109 1727204233.84884: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is 
address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44109 1727204233.85207: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK <<< 44109 1727204233.85312: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44109 1727204233.85385: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44109 1727204233.87549: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44109 1727204233.87552: stdout chunk (state=3): >>><<< 44109 1727204233.87555: stderr chunk (state=3): >>><<< 44109 1727204233.87557: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: 
Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0
44109 1727204233.87559: handler run complete
44109 1727204233.87561: attempt loop complete, returning result
44109 1727204233.87563: _execute() done
44109 1727204233.87565: dumping result to json
44109 1727204233.87579: done dumping result, returning
44109 1727204233.87582: done running TaskExecutor() for managed-node1/TASK: Get stat for interface ethtest0 [028d2410-947f-ed67-a560-0000000003b5]
44109 1727204233.87584: sending task result for task 028d2410-947f-ed67-a560-0000000003b5
44109 1727204233.87859: done sending task result for task 028d2410-947f-ed67-a560-0000000003b5
44109 1727204233.87863: WORKER PROCESS EXITING
ok: [managed-node1] => {
    "changed": false,
    "stat": {
        "atime": 1727204231.8174593,
        "block_size": 4096,
        "blocks": 0,
        "ctime": 1727204231.8174593,
        "dev": 23,
        "device_type": 0,
        "executable": true,
        "exists": true,
        "gid": 0,
        "gr_name": "root",
        "inode": 31020,
        "isblk": false,
        "ischr": false,
        "isdir": false,
        "isfifo": false,
        "isgid": false,
        "islnk": true,
        "isreg": false,
        "issock": false,
        "isuid": false,
        "lnk_source": "/sys/devices/virtual/net/ethtest0",
        "lnk_target": "../../devices/virtual/net/ethtest0",
        "mode": "0777",
        "mtime": 1727204231.8174593,
        "nlink": 1,
        "path": "/sys/class/net/ethtest0",
        "pw_name": "root",
        "readable": true,
        "rgrp": true,
        "roth": true,
        "rusr": true,
        "size": 0,
        "uid": 0,
        "wgrp": true,
        "woth": true,
        "writeable": true,
        "wusr": true,
        "xgrp": true,
        "xoth": true,
        "xusr": true
    }
}
44109 1727204233.87974: no more pending results, returning what we have
44109 1727204233.87980: results queue empty
44109 1727204233.87981: checking for any_errors_fatal
44109 1727204233.87983: done checking for any_errors_fatal
44109 1727204233.87984: checking for max_fail_percentage
44109 1727204233.87985: done checking for
max_fail_percentage 44109 1727204233.87986: checking to see if all hosts have failed and the running result is not ok 44109 1727204233.87987: done checking to see if all hosts have failed 44109 1727204233.87988: getting the remaining hosts for this loop 44109 1727204233.87989: done getting the remaining hosts for this loop 44109 1727204233.87993: getting the next task for host managed-node1 44109 1727204233.88002: done getting next task for host managed-node1 44109 1727204233.88005: ^ task is: TASK: Assert that the interface is present - '{{ interface }}' 44109 1727204233.88008: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44109 1727204233.88013: getting variables 44109 1727204233.88015: in VariableManager get_vars() 44109 1727204233.88054: Calling all_inventory to load vars for managed-node1 44109 1727204233.88058: Calling groups_inventory to load vars for managed-node1 44109 1727204233.88060: Calling all_plugins_inventory to load vars for managed-node1 44109 1727204233.88071: Calling all_plugins_play to load vars for managed-node1 44109 1727204233.88074: Calling groups_plugins_inventory to load vars for managed-node1 44109 1727204233.88786: Calling groups_plugins_play to load vars for managed-node1 44109 1727204233.89404: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44109 1727204233.89804: done with get_vars() 44109 1727204233.89822: done getting variables 44109 1727204233.90202: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) 44109 1727204233.90447: variable 'interface' from source: set_fact TASK [Assert that the interface is present - 'ethtest0'] *********************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:5 Tuesday 24 September 2024 14:57:13 -0400 (0:00:00.428) 0:00:10.702 ***** 44109 1727204233.90600: entering _queue_task() for managed-node1/assert 44109 1727204233.90601: Creating lock for assert 44109 1727204233.91146: worker is 1 (out of 1 available) 44109 1727204233.91159: exiting _queue_task() for managed-node1/assert 44109 1727204233.91219: done queuing things up, now waiting for results queue to drain 44109 1727204233.91221: waiting for pending results... 
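The "Get stat for interface ethtest0" result above shows how presence is verified: the stat module inspects /sys/class/net/ethtest0, which on this node is a symlink (islnk: true) into /sys/devices/virtual/net/. The same existence check can be sketched outside Ansible; `interface_present` is a hypothetical helper, with the sysfs base path made a parameter purely so the sketch can be exercised against a scratch directory:

```python
import os

def interface_present(name: str, base: str = "/sys/class/net") -> bool:
    """Mirror the log's check: an interface exists when /sys/class/net/<name>
    is present. lexists() is used because the entry is a symlink into
    /sys/devices/... and we only care that the link itself exists."""
    return os.path.lexists(os.path.join(base, name))
```

This is only the presence test; the assert task that follows in the log consumes the registered result as `interface_stat.stat.exists`.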
44109 1727204233.91726: running TaskExecutor() for managed-node1/TASK: Assert that the interface is present - 'ethtest0' 44109 1727204233.91733: in run() - task 028d2410-947f-ed67-a560-0000000002ed 44109 1727204233.91737: variable 'ansible_search_path' from source: unknown 44109 1727204233.91740: variable 'ansible_search_path' from source: unknown 44109 1727204233.91793: calling self._execute() 44109 1727204233.92086: variable 'ansible_host' from source: host vars for 'managed-node1' 44109 1727204233.92090: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44109 1727204233.92093: variable 'omit' from source: magic vars 44109 1727204233.92980: variable 'ansible_distribution_major_version' from source: facts 44109 1727204233.92985: Evaluated conditional (ansible_distribution_major_version != '6'): True 44109 1727204233.92988: variable 'omit' from source: magic vars 44109 1727204233.93023: variable 'omit' from source: magic vars 44109 1727204233.93130: variable 'interface' from source: set_fact 44109 1727204233.93153: variable 'omit' from source: magic vars 44109 1727204233.93207: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 44109 1727204233.93247: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 44109 1727204233.93271: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 44109 1727204233.93306: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44109 1727204233.93324: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44109 1727204233.93412: variable 'inventory_hostname' from source: host vars for 'managed-node1' 44109 1727204233.93416: variable 'ansible_host' from source: host vars for 'managed-node1' 44109 1727204233.93418: 
variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44109 1727204233.93490: Set connection var ansible_connection to ssh 44109 1727204233.93502: Set connection var ansible_timeout to 10 44109 1727204233.93522: Set connection var ansible_module_compression to ZIP_DEFLATED 44109 1727204233.93537: Set connection var ansible_pipelining to False 44109 1727204233.93548: Set connection var ansible_shell_executable to /bin/sh 44109 1727204233.93557: Set connection var ansible_shell_type to sh 44109 1727204233.93585: variable 'ansible_shell_executable' from source: unknown 44109 1727204233.93594: variable 'ansible_connection' from source: unknown 44109 1727204233.93629: variable 'ansible_module_compression' from source: unknown 44109 1727204233.93632: variable 'ansible_shell_type' from source: unknown 44109 1727204233.93635: variable 'ansible_shell_executable' from source: unknown 44109 1727204233.93637: variable 'ansible_host' from source: host vars for 'managed-node1' 44109 1727204233.93639: variable 'ansible_pipelining' from source: unknown 44109 1727204233.93642: variable 'ansible_timeout' from source: unknown 44109 1727204233.93644: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44109 1727204233.93849: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 44109 1727204233.93852: variable 'omit' from source: magic vars 44109 1727204233.93855: starting attempt loop 44109 1727204233.93857: running the handler 44109 1727204233.93968: variable 'interface_stat' from source: set_fact 44109 1727204233.93993: Evaluated conditional (interface_stat.stat.exists): True 44109 1727204233.94004: handler run complete 44109 1727204233.94023: attempt loop complete, returning result 44109 
1727204233.94031: _execute() done
44109 1727204233.94038: dumping result to json
44109 1727204233.94045: done dumping result, returning
44109 1727204233.94063: done running TaskExecutor() for managed-node1/TASK: Assert that the interface is present - 'ethtest0' [028d2410-947f-ed67-a560-0000000002ed]
44109 1727204233.94174: sending task result for task 028d2410-947f-ed67-a560-0000000002ed
44109 1727204233.94244: done sending task result for task 028d2410-947f-ed67-a560-0000000002ed
44109 1727204233.94248: WORKER PROCESS EXITING
ok: [managed-node1] => {
    "changed": false
}

MSG:

All assertions passed
44109 1727204233.94308: no more pending results, returning what we have
44109 1727204233.94312: results queue empty
44109 1727204233.94313: checking for any_errors_fatal
44109 1727204233.94322: done checking for any_errors_fatal
44109 1727204233.94323: checking for max_fail_percentage
44109 1727204233.94325: done checking for max_fail_percentage
44109 1727204233.94325: checking to see if all hosts have failed and the running result is not ok
44109 1727204233.94326: done checking to see if all hosts have failed
44109 1727204233.94327: getting the remaining hosts for this loop
44109 1727204233.94328: done getting the remaining hosts for this loop
44109 1727204233.94332: getting the next task for host managed-node1
44109 1727204233.94340: done getting next task for host managed-node1
44109 1727204233.94342: ^ task is: TASK: Create a dedicated test file in `/etc/iproute2/rt_tables.d/` and add a new routing table
44109 1727204233.94344: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task?
False 44109 1727204233.94348: getting variables 44109 1727204233.94350: in VariableManager get_vars() 44109 1727204233.94584: Calling all_inventory to load vars for managed-node1 44109 1727204233.94588: Calling groups_inventory to load vars for managed-node1 44109 1727204233.94590: Calling all_plugins_inventory to load vars for managed-node1 44109 1727204233.94599: Calling all_plugins_play to load vars for managed-node1 44109 1727204233.94603: Calling groups_plugins_inventory to load vars for managed-node1 44109 1727204233.94606: Calling groups_plugins_play to load vars for managed-node1 44109 1727204233.94878: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44109 1727204233.95241: done with get_vars() 44109 1727204233.95251: done getting variables TASK [Create a dedicated test file in `/etc/iproute2/rt_tables.d/` and add a new routing table] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_routing_rules.yml:23 Tuesday 24 September 2024 14:57:13 -0400 (0:00:00.048) 0:00:10.750 ***** 44109 1727204233.95457: entering _queue_task() for managed-node1/lineinfile 44109 1727204233.95459: Creating lock for lineinfile 44109 1727204233.95964: worker is 1 (out of 1 available) 44109 1727204233.96079: exiting _queue_task() for managed-node1/lineinfile 44109 1727204233.96093: done queuing things up, now waiting for results queue to drain 44109 1727204233.96094: waiting for pending results... 
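The lineinfile task queued above ends up (per the module_args echoed later in the log) writing the line `200 custom` to /etc/iproute2/rt_tables.d/table.conf with `create: true`, `state: present`, and no `regexp`. A rough sketch of that create-plus-idempotent-append behaviour, not the module's actual implementation:

```python
import os

def ensure_line(path: str, line: str, create: bool = True) -> bool:
    """Roughly emulate lineinfile with state=present and no regexp:
    create the file if allowed, append `line` unless an identical line
    already exists. Returns True when the file changed."""
    if not os.path.exists(path):
        if not create:
            raise FileNotFoundError(path)
        lines = []
    else:
        with open(path) as f:
            lines = f.read().splitlines()
    if line in lines:
        return False  # already present -> changed: false on reruns
    lines.append(line)
    with open(path, "w") as f:
        f.write("\n".join(lines) + "\n")
    return True
```

Run twice against the same file, the first call reports a change (matching the `"changed": true` result below) and the second does not, which is the idempotence the test relies on.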
44109 1727204233.96393: running TaskExecutor() for managed-node1/TASK: Create a dedicated test file in `/etc/iproute2/rt_tables.d/` and add a new routing table 44109 1727204233.96434: in run() - task 028d2410-947f-ed67-a560-00000000000f 44109 1727204233.96440: variable 'ansible_search_path' from source: unknown 44109 1727204233.96464: calling self._execute() 44109 1727204233.96649: variable 'ansible_host' from source: host vars for 'managed-node1' 44109 1727204233.96655: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44109 1727204233.96658: variable 'omit' from source: magic vars 44109 1727204233.96915: variable 'ansible_distribution_major_version' from source: facts 44109 1727204233.96931: Evaluated conditional (ansible_distribution_major_version != '6'): True 44109 1727204233.96936: variable 'omit' from source: magic vars 44109 1727204233.96978: variable 'omit' from source: magic vars 44109 1727204233.96989: variable 'omit' from source: magic vars 44109 1727204233.97032: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 44109 1727204233.97064: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 44109 1727204233.97086: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 44109 1727204233.97102: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44109 1727204233.97118: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44109 1727204233.97147: variable 'inventory_hostname' from source: host vars for 'managed-node1' 44109 1727204233.97150: variable 'ansible_host' from source: host vars for 'managed-node1' 44109 1727204233.97152: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44109 1727204233.97254: Set 
connection var ansible_connection to ssh 44109 1727204233.97261: Set connection var ansible_timeout to 10 44109 1727204233.97267: Set connection var ansible_module_compression to ZIP_DEFLATED 44109 1727204233.97276: Set connection var ansible_pipelining to False 44109 1727204233.97302: Set connection var ansible_shell_executable to /bin/sh 44109 1727204233.97306: Set connection var ansible_shell_type to sh 44109 1727204233.97311: variable 'ansible_shell_executable' from source: unknown 44109 1727204233.97316: variable 'ansible_connection' from source: unknown 44109 1727204233.97319: variable 'ansible_module_compression' from source: unknown 44109 1727204233.97321: variable 'ansible_shell_type' from source: unknown 44109 1727204233.97324: variable 'ansible_shell_executable' from source: unknown 44109 1727204233.97326: variable 'ansible_host' from source: host vars for 'managed-node1' 44109 1727204233.97328: variable 'ansible_pipelining' from source: unknown 44109 1727204233.97330: variable 'ansible_timeout' from source: unknown 44109 1727204233.97333: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44109 1727204233.97582: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 44109 1727204233.97587: variable 'omit' from source: magic vars 44109 1727204233.97590: starting attempt loop 44109 1727204233.97592: running the handler 44109 1727204233.97594: _low_level_execute_command(): starting 44109 1727204233.97596: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 44109 1727204233.98574: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44109 1727204233.98733: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK <<< 44109 1727204233.98772: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44109 1727204233.98939: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44109 1727204234.00728: stdout chunk (state=3): >>>/root <<< 44109 1727204234.00830: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44109 1727204234.00884: stderr chunk (state=3): >>><<< 44109 1727204234.00899: stdout chunk (state=3): >>><<< 44109 1727204234.00938: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: 
Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44109 1727204234.00959: _low_level_execute_command(): starting 44109 1727204234.00972: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204234.0094485-45248-280764916138186 `" && echo ansible-tmp-1727204234.0094485-45248-280764916138186="` echo /root/.ansible/tmp/ansible-tmp-1727204234.0094485-45248-280764916138186 `" ) && sleep 0' 44109 1727204234.01658: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44109 1727204234.01672: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44109 1727204234.01689: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44109 1727204234.01708: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44109 1727204234.01744: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration <<< 44109 1727204234.01767: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44109 
1727204234.01789: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44109 1727204234.01856: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44109 1727204234.01898: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK <<< 44109 1727204234.01922: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44109 1727204234.02110: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44109 1727204234.04197: stdout chunk (state=3): >>>ansible-tmp-1727204234.0094485-45248-280764916138186=/root/.ansible/tmp/ansible-tmp-1727204234.0094485-45248-280764916138186 <<< 44109 1727204234.04361: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44109 1727204234.04365: stdout chunk (state=3): >>><<< 44109 1727204234.04367: stderr chunk (state=3): >>><<< 44109 1727204234.04395: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204234.0094485-45248-280764916138186=/root/.ansible/tmp/ansible-tmp-1727204234.0094485-45248-280764916138186 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: 
Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44109 1727204234.04446: variable 'ansible_module_compression' from source: unknown 44109 1727204234.04581: ANSIBALLZ: Using lock for lineinfile 44109 1727204234.04586: ANSIBALLZ: Acquiring lock 44109 1727204234.04589: ANSIBALLZ: Lock acquired: 139907464900064 44109 1727204234.04591: ANSIBALLZ: Creating module 44109 1727204234.17955: ANSIBALLZ: Writing module into payload 44109 1727204234.18086: ANSIBALLZ: Writing module 44109 1727204234.18117: ANSIBALLZ: Renaming module 44109 1727204234.18129: ANSIBALLZ: Done creating module 44109 1727204234.18151: variable 'ansible_facts' from source: unknown 44109 1727204234.18279: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204234.0094485-45248-280764916138186/AnsiballZ_lineinfile.py 44109 1727204234.18404: Sending initial data 44109 1727204234.18413: Sent initial data (159 bytes) 44109 1727204234.19071: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44109 1727204234.19190: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: 
hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK <<< 44109 1727204234.19233: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44109 1727204234.19317: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44109 1727204234.21089: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 44109 1727204234.21184: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 44109 1727204234.21264: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-44109pzfqangk/tmpql19uy5k /root/.ansible/tmp/ansible-tmp-1727204234.0094485-45248-280764916138186/AnsiballZ_lineinfile.py <<< 44109 1727204234.21268: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204234.0094485-45248-280764916138186/AnsiballZ_lineinfile.py" <<< 44109 1727204234.21379: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-44109pzfqangk/tmpql19uy5k" to remote "/root/.ansible/tmp/ansible-tmp-1727204234.0094485-45248-280764916138186/AnsiballZ_lineinfile.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204234.0094485-45248-280764916138186/AnsiballZ_lineinfile.py" <<< 44109 1727204234.22322: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44109 1727204234.22341: stderr chunk (state=3): >>><<< 44109 1727204234.22451: stdout chunk (state=3): >>><<< 44109 1727204234.22454: done transferring module to remote 44109 1727204234.22456: _low_level_execute_command(): starting 44109 1727204234.22459: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204234.0094485-45248-280764916138186/ /root/.ansible/tmp/ansible-tmp-1727204234.0094485-45248-280764916138186/AnsiballZ_lineinfile.py && sleep 0' 44109 1727204234.23084: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44109 1727204234.23106: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44109 1727204234.23130: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44109 1727204234.23199: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44109 1727204234.23262: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 44109 1727204234.23283: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44109 1727204234.23312: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44109 1727204234.23433: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44109 1727204234.25550: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44109 1727204234.25554: stdout chunk (state=3): >>><<< 44109 1727204234.25556: stderr chunk (state=3): >>><<< 44109 1727204234.25771: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44109 1727204234.25777: _low_level_execute_command(): starting 44109 1727204234.25779: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204234.0094485-45248-280764916138186/AnsiballZ_lineinfile.py && sleep 0' 44109 1727204234.26850: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44109 1727204234.27201: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK <<< 44109 1727204234.27417: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44109 1727204234.27509: 
stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44109 1727204234.45147: stdout chunk (state=3): >>> {"changed": true, "msg": "line added", "backup": "", "diff": [{"before": "", "after": "", "before_header": "/etc/iproute2/rt_tables.d/table.conf (content)", "after_header": "/etc/iproute2/rt_tables.d/table.conf (content)"}, {"before_header": "/etc/iproute2/rt_tables.d/table.conf (file attributes)", "after_header": "/etc/iproute2/rt_tables.d/table.conf (file attributes)"}], "invocation": {"module_args": {"path": "/etc/iproute2/rt_tables.d/table.conf", "line": "200 custom", "mode": "0644", "create": true, "state": "present", "backrefs": false, "backup": false, "firstmatch": false, "unsafe_writes": false, "regexp": null, "search_string": null, "insertafter": null, "insertbefore": null, "validate": null, "owner": null, "group": null, "seuser": null, "serole": null, "selevel": null, "setype": null, "attributes": null}}} <<< 44109 1727204234.46780: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44109 1727204234.46783: stderr chunk (state=3): >>>Shared connection to 10.31.14.47 closed. 
<<< 44109 1727204234.46786: stderr chunk (state=3): >>><<< 44109 1727204234.46997: stdout chunk (state=3): >>><<< 44109 1727204234.47020: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "msg": "line added", "backup": "", "diff": [{"before": "", "after": "", "before_header": "/etc/iproute2/rt_tables.d/table.conf (content)", "after_header": "/etc/iproute2/rt_tables.d/table.conf (content)"}, {"before_header": "/etc/iproute2/rt_tables.d/table.conf (file attributes)", "after_header": "/etc/iproute2/rt_tables.d/table.conf (file attributes)"}], "invocation": {"module_args": {"path": "/etc/iproute2/rt_tables.d/table.conf", "line": "200 custom", "mode": "0644", "create": true, "state": "present", "backrefs": false, "backup": false, "firstmatch": false, "unsafe_writes": false, "regexp": null, "search_string": null, "insertafter": null, "insertbefore": null, "validate": null, "owner": null, "group": null, "seuser": null, "serole": null, "selevel": null, "setype": null, "attributes": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at 
'/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.47 closed. 44109 1727204234.47210: done with _execute_module (lineinfile, {'path': '/etc/iproute2/rt_tables.d/table.conf', 'line': '200 custom', 'mode': '0644', 'create': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'lineinfile', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204234.0094485-45248-280764916138186/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 44109 1727204234.47216: _low_level_execute_command(): starting 44109 1727204234.47218: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204234.0094485-45248-280764916138186/ > /dev/null 2>&1 && sleep 0' 44109 1727204234.48104: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44109 1727204234.48294: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44109 1727204234.48491: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 
10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK <<< 44109 1727204234.48793: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44109 1727204234.48962: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44109 1727204234.51003: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44109 1727204234.51007: stdout chunk (state=3): >>><<< 44109 1727204234.51015: stderr chunk (state=3): >>><<< 44109 1727204234.51029: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at 
'/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44109 1727204234.51036: handler run complete 44109 1727204234.51080: attempt loop complete, returning result 44109 1727204234.51086: _execute() done 44109 1727204234.51089: dumping result to json 44109 1727204234.51091: done dumping result, returning 44109 1727204234.51093: done running TaskExecutor() for managed-node1/TASK: Create a dedicated test file in `/etc/iproute2/rt_tables.d/` and add a new routing table [028d2410-947f-ed67-a560-00000000000f] 44109 1727204234.51095: sending task result for task 028d2410-947f-ed67-a560-00000000000f changed: [managed-node1] => { "backup": "", "changed": true } MSG: line added 44109 1727204234.51489: no more pending results, returning what we have 44109 1727204234.51494: results queue empty 44109 1727204234.51495: checking for any_errors_fatal 44109 1727204234.51500: done checking for any_errors_fatal 44109 1727204234.51500: checking for max_fail_percentage 44109 1727204234.51502: done checking for max_fail_percentage 44109 1727204234.51503: checking to see if all hosts have failed and the running result is not ok 44109 1727204234.51504: done checking to see if all hosts have failed 44109 1727204234.51505: getting the remaining hosts for this loop 44109 1727204234.51507: done getting the remaining hosts for this loop 44109 1727204234.51581: getting the next task for host managed-node1 44109 1727204234.51588: done getting next task for host managed-node1 44109 1727204234.51593: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 44109 1727204234.51596: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44109 1727204234.51609: getting variables 44109 1727204234.51614: in VariableManager get_vars() 44109 1727204234.51765: Calling all_inventory to load vars for managed-node1 44109 1727204234.51768: Calling groups_inventory to load vars for managed-node1 44109 1727204234.51770: Calling all_plugins_inventory to load vars for managed-node1 44109 1727204234.51784: Calling all_plugins_play to load vars for managed-node1 44109 1727204234.51787: Calling groups_plugins_inventory to load vars for managed-node1 44109 1727204234.51793: done sending task result for task 028d2410-947f-ed67-a560-00000000000f 44109 1727204234.51796: WORKER PROCESS EXITING 44109 1727204234.51800: Calling groups_plugins_play to load vars for managed-node1 44109 1727204234.52154: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44109 1727204234.52745: done with get_vars() 44109 1727204234.52756: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4 Tuesday 24 September 2024 14:57:14 -0400 (0:00:00.575) 0:00:11.325 ***** 44109 1727204234.52968: entering _queue_task() for managed-node1/include_tasks 44109 1727204234.53641: worker is 1 (out of 1 available) 44109 1727204234.53653: exiting _queue_task() for managed-node1/include_tasks 44109 1727204234.53665: done queuing things up, now waiting for results queue to drain 44109 1727204234.53666: waiting for pending results... 
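The `lineinfile` invocation dumped earlier in the log (the `module_args` at `1727204234.47210`) can be written back as a playbook task. This is a hedged reconstruction from those logged arguments only: the task name and non-default parameters are quoted from the log, defaults are omitted, and the surrounding play is not shown because it does not appear in the transcript.

```yaml
- name: Create a dedicated test file in `/etc/iproute2/rt_tables.d/` and add a new routing table
  ansible.builtin.lineinfile:
    path: /etc/iproute2/rt_tables.d/table.conf
    line: "200 custom"
    mode: "0644"
    create: true
```

The result block (`"changed": true`, `"msg": "line added"`) confirms the line was appended and the file created, since `create: true` was set.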
44109 1727204234.54404: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 44109 1727204234.54733: in run() - task 028d2410-947f-ed67-a560-000000000017 44109 1727204234.54737: variable 'ansible_search_path' from source: unknown 44109 1727204234.54745: variable 'ansible_search_path' from source: unknown 44109 1727204234.55181: calling self._execute() 44109 1727204234.55185: variable 'ansible_host' from source: host vars for 'managed-node1' 44109 1727204234.55188: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44109 1727204234.55191: variable 'omit' from source: magic vars 44109 1727204234.55924: variable 'ansible_distribution_major_version' from source: facts 44109 1727204234.56282: Evaluated conditional (ansible_distribution_major_version != '6'): True 44109 1727204234.56286: _execute() done 44109 1727204234.56288: dumping result to json 44109 1727204234.56291: done dumping result, returning 44109 1727204234.56293: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [028d2410-947f-ed67-a560-000000000017] 44109 1727204234.56296: sending task result for task 028d2410-947f-ed67-a560-000000000017 44109 1727204234.56370: done sending task result for task 028d2410-947f-ed67-a560-000000000017 44109 1727204234.56373: WORKER PROCESS EXITING 44109 1727204234.56418: no more pending results, returning what we have 44109 1727204234.56423: in VariableManager get_vars() 44109 1727204234.56471: Calling all_inventory to load vars for managed-node1 44109 1727204234.56474: Calling groups_inventory to load vars for managed-node1 44109 1727204234.56479: Calling all_plugins_inventory to load vars for managed-node1 44109 1727204234.56491: Calling all_plugins_play to load vars for managed-node1 44109 1727204234.56494: Calling groups_plugins_inventory to load vars for managed-node1 44109 1727204234.56498: Calling 
groups_plugins_play to load vars for managed-node1 44109 1727204234.56893: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44109 1727204234.57522: done with get_vars() 44109 1727204234.57533: variable 'ansible_search_path' from source: unknown 44109 1727204234.57534: variable 'ansible_search_path' from source: unknown 44109 1727204234.57574: we have included files to process 44109 1727204234.57577: generating all_blocks data 44109 1727204234.57579: done generating all_blocks data 44109 1727204234.57585: processing included file: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 44109 1727204234.57586: loading included file: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 44109 1727204234.57588: Loading data from /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 44109 1727204234.59466: done processing included file 44109 1727204234.59468: iterating over new_blocks loaded from include file 44109 1727204234.59470: in VariableManager get_vars() 44109 1727204234.59526: done with get_vars() 44109 1727204234.59528: filtering new block on tags 44109 1727204234.59545: done filtering new block on tags 44109 1727204234.59548: in VariableManager get_vars() 44109 1727204234.59569: done with get_vars() 44109 1727204234.59570: filtering new block on tags 44109 1727204234.59795: done filtering new block on tags 44109 1727204234.59798: in VariableManager get_vars() 44109 1727204234.59820: done with get_vars() 44109 1727204234.59822: filtering new block on tags 44109 1727204234.59841: done filtering new block on tags 44109 1727204234.59843: done iterating over new_blocks loaded from include file included: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml for managed-node1 44109 1727204234.59849: extending task lists for 
all hosts with included blocks 44109 1727204234.61863: done extending task lists 44109 1727204234.61865: done processing included files 44109 1727204234.61866: results queue empty 44109 1727204234.61866: checking for any_errors_fatal 44109 1727204234.61873: done checking for any_errors_fatal 44109 1727204234.61874: checking for max_fail_percentage 44109 1727204234.61877: done checking for max_fail_percentage 44109 1727204234.61878: checking to see if all hosts have failed and the running result is not ok 44109 1727204234.61879: done checking to see if all hosts have failed 44109 1727204234.61880: getting the remaining hosts for this loop 44109 1727204234.61881: done getting the remaining hosts for this loop 44109 1727204234.61883: getting the next task for host managed-node1 44109 1727204234.61888: done getting next task for host managed-node1 44109 1727204234.61891: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 44109 1727204234.61894: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44109 1727204234.61903: getting variables 44109 1727204234.61905: in VariableManager get_vars() 44109 1727204234.61921: Calling all_inventory to load vars for managed-node1 44109 1727204234.61924: Calling groups_inventory to load vars for managed-node1 44109 1727204234.61926: Calling all_plugins_inventory to load vars for managed-node1 44109 1727204234.61932: Calling all_plugins_play to load vars for managed-node1 44109 1727204234.61934: Calling groups_plugins_inventory to load vars for managed-node1 44109 1727204234.61937: Calling groups_plugins_play to load vars for managed-node1 44109 1727204234.62332: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44109 1727204234.62763: done with get_vars() 44109 1727204234.62774: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role are present] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:3 Tuesday 24 September 2024 14:57:14 -0400 (0:00:00.100) 0:00:11.426 ***** 44109 1727204234.63054: entering _queue_task() for managed-node1/setup 44109 1727204234.63906: worker is 1 (out of 1 available) 44109 1727204234.63919: exiting _queue_task() for managed-node1/setup 44109 1727204234.63929: done queuing things up, now waiting for results queue to drain 44109 1727204234.63931: waiting for pending results... 
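The effect reported by the `lineinfile` result above can be reproduced self-contained; this sketch writes the same `"200 custom"` line into a temporary directory (the real path on the managed node is `/etc/iproute2/rt_tables.d/table.conf`) and parses it the way iproute2 maps table names to IDs:

```python
# Reproduce the file content the lineinfile task wrote, in a scratch dir.
import pathlib
import tempfile

tmp = pathlib.Path(tempfile.mkdtemp())
conf = tmp / "table.conf"
conf.write_text("200 custom\n")  # the "line" value from module_args

# rt_tables.d entries are "<id> <name>" pairs; parse them into a mapping.
entries = dict(line.split() for line in conf.read_text().splitlines())
print(entries)  # {'200': 'custom'}
```

With that entry in place on a real node, iproute2 resolves the name `custom` to table ID 200, so `ip route show table custom` is equivalent to `ip route show table 200`.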
44109 1727204234.64500: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 44109 1727204234.65282: in run() - task 028d2410-947f-ed67-a560-0000000003d0 44109 1727204234.65286: variable 'ansible_search_path' from source: unknown 44109 1727204234.65289: variable 'ansible_search_path' from source: unknown 44109 1727204234.65292: calling self._execute() 44109 1727204234.65294: variable 'ansible_host' from source: host vars for 'managed-node1' 44109 1727204234.65297: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44109 1727204234.65299: variable 'omit' from source: magic vars 44109 1727204234.66782: variable 'ansible_distribution_major_version' from source: facts 44109 1727204234.66786: Evaluated conditional (ansible_distribution_major_version != '6'): True 44109 1727204234.66934: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 44109 1727204234.70056: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 44109 1727204234.70585: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 44109 1727204234.70722: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 44109 1727204234.70761: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 44109 1727204234.70847: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 44109 1727204234.70984: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44109 1727204234.71040: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' 
from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44109 1727204234.71151: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44109 1727204234.71206: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44109 1727204234.71294: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44109 1727204234.71417: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44109 1727204234.71523: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44109 1727204234.71589: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44109 1727204234.71640: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44109 1727204234.71664: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44109 1727204234.71834: variable '__network_required_facts' from source: role 
'' defaults 44109 1727204234.71849: variable 'ansible_facts' from source: unknown 44109 1727204234.71945: Evaluated conditional (__network_required_facts | difference(ansible_facts.keys() | list) | length > 0): False 44109 1727204234.71953: when evaluation is False, skipping this task 44109 1727204234.71960: _execute() done 44109 1727204234.71967: dumping result to json 44109 1727204234.71973: done dumping result, returning 44109 1727204234.71992: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present [028d2410-947f-ed67-a560-0000000003d0] 44109 1727204234.72002: sending task result for task 028d2410-947f-ed67-a560-0000000003d0 skipping: [managed-node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 44109 1727204234.72249: no more pending results, returning what we have 44109 1727204234.72253: results queue empty 44109 1727204234.72254: checking for any_errors_fatal 44109 1727204234.72256: done checking for any_errors_fatal 44109 1727204234.72257: checking for max_fail_percentage 44109 1727204234.72259: done checking for max_fail_percentage 44109 1727204234.72260: checking to see if all hosts have failed and the running result is not ok 44109 1727204234.72261: done checking to see if all hosts have failed 44109 1727204234.72261: getting the remaining hosts for this loop 44109 1727204234.72263: done getting the remaining hosts for this loop 44109 1727204234.72266: getting the next task for host managed-node1 44109 1727204234.72278: done getting next task for host managed-node1 44109 1727204234.72282: ^ task is: TASK: fedora.linux_system_roles.network : Check if system is ostree 44109 1727204234.72286: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44109 1727204234.72299: getting variables 44109 1727204234.72301: in VariableManager get_vars() 44109 1727204234.72349: Calling all_inventory to load vars for managed-node1 44109 1727204234.72352: Calling groups_inventory to load vars for managed-node1 44109 1727204234.72355: Calling all_plugins_inventory to load vars for managed-node1 44109 1727204234.72366: Calling all_plugins_play to load vars for managed-node1 44109 1727204234.72368: Calling groups_plugins_inventory to load vars for managed-node1 44109 1727204234.72371: Calling groups_plugins_play to load vars for managed-node1 44109 1727204234.72432: done sending task result for task 028d2410-947f-ed67-a560-0000000003d0 44109 1727204234.72436: WORKER PROCESS EXITING 44109 1727204234.72868: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44109 1727204234.73091: done with get_vars() 44109 1727204234.73120: done getting variables TASK [fedora.linux_system_roles.network : Check if system is ostree] *********** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:12 Tuesday 24 September 2024 14:57:14 -0400 (0:00:00.101) 0:00:11.528 ***** 44109 1727204234.73230: entering _queue_task() for managed-node1/stat 44109 1727204234.73508: worker is 
1 (out of 1 available) 44109 1727204234.73522: exiting _queue_task() for managed-node1/stat 44109 1727204234.73533: done queuing things up, now waiting for results queue to drain 44109 1727204234.73534: waiting for pending results... 44109 1727204234.73818: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Check if system is ostree 44109 1727204234.73964: in run() - task 028d2410-947f-ed67-a560-0000000003d2 44109 1727204234.73992: variable 'ansible_search_path' from source: unknown 44109 1727204234.74002: variable 'ansible_search_path' from source: unknown 44109 1727204234.74042: calling self._execute() 44109 1727204234.74181: variable 'ansible_host' from source: host vars for 'managed-node1' 44109 1727204234.74185: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44109 1727204234.74187: variable 'omit' from source: magic vars 44109 1727204234.75083: variable 'ansible_distribution_major_version' from source: facts 44109 1727204234.75087: Evaluated conditional (ansible_distribution_major_version != '6'): True 44109 1727204234.75269: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 44109 1727204234.76044: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 44109 1727204234.76094: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 44109 1727204234.76345: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 44109 1727204234.76348: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 44109 1727204234.76563: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 44109 1727204234.76566: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 44109 1727204234.76569: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 44109 1727204234.76691: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 44109 1727204234.76889: variable '__network_is_ostree' from source: set_fact 44109 1727204234.76900: Evaluated conditional (not __network_is_ostree is defined): False 44109 1727204234.76906: when evaluation is False, skipping this task 44109 1727204234.76914: _execute() done 44109 1727204234.76920: dumping result to json 44109 1727204234.76925: done dumping result, returning 44109 1727204234.76934: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Check if system is ostree [028d2410-947f-ed67-a560-0000000003d2] 44109 1727204234.76941: sending task result for task 028d2410-947f-ed67-a560-0000000003d2 skipping: [managed-node1] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 44109 1727204234.77080: no more pending results, returning what we have 44109 1727204234.77085: results queue empty 44109 1727204234.77086: checking for any_errors_fatal 44109 1727204234.77094: done checking for any_errors_fatal 44109 1727204234.77095: checking for max_fail_percentage 44109 1727204234.77097: done checking for max_fail_percentage 44109 1727204234.77097: checking to see if all hosts have failed and the running result is not ok 44109 1727204234.77098: done checking to see if all hosts have failed 44109 1727204234.77099: getting the remaining hosts for this loop 44109 
1727204234.77101: done getting the remaining hosts for this loop 44109 1727204234.77106: getting the next task for host managed-node1 44109 1727204234.77116: done getting next task for host managed-node1 44109 1727204234.77120: ^ task is: TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 44109 1727204234.77124: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44109 1727204234.77139: getting variables 44109 1727204234.77141: in VariableManager get_vars() 44109 1727204234.77184: Calling all_inventory to load vars for managed-node1 44109 1727204234.77187: Calling groups_inventory to load vars for managed-node1 44109 1727204234.77190: Calling all_plugins_inventory to load vars for managed-node1 44109 1727204234.77201: Calling all_plugins_play to load vars for managed-node1 44109 1727204234.77203: Calling groups_plugins_inventory to load vars for managed-node1 44109 1727204234.77206: Calling groups_plugins_play to load vars for managed-node1 44109 1727204234.77752: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44109 1727204234.78683: done sending task result for task 028d2410-947f-ed67-a560-0000000003d2 44109 1727204234.78686: WORKER PROCESS EXITING 44109 1727204234.78969: done with get_vars() 44109 1727204234.78983: done getting variables 44109 1727204234.79048: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Set flag to indicate system is ostree] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:17 Tuesday 24 September 2024 14:57:14 -0400 (0:00:00.062) 0:00:11.591 ***** 44109 1727204234.79494: entering _queue_task() for managed-node1/set_fact 44109 1727204234.80221: worker is 1 (out of 1 available) 44109 1727204234.80234: exiting _queue_task() for managed-node1/set_fact 44109 1727204234.80246: done queuing things up, now waiting for results queue to drain 44109 1727204234.80247: waiting for pending results... 
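The skip decision logged above for "Ensure ansible_facts used by role are present" comes from the conditional `__network_required_facts | difference(ansible_facts.keys() | list) | length > 0`: gather facts only if some required fact name is still missing. A minimal Python sketch of that logic, with assumed fact names (the role's real list is not shown in the log):

```python
# Assumed required-fact names and fact values, for illustration only.
required_facts = ["distribution", "distribution_major_version", "os_family"]
ansible_facts = {
    "distribution": "Fedora",
    "distribution_major_version": "40",
    "os_family": "RedHat",
}

# Jinja's "difference" filter: required names not present in the gathered facts.
missing = [name for name in required_facts if name not in ansible_facts]
run_setup = len(missing) > 0
print(run_setup)  # False -> the setup task is skipped, matching the log
```

Because every required fact was already cached, the evaluation is `False` and the log records "when evaluation is False, skipping this task".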
44109 1727204234.80736: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 44109 1727204234.80969: in run() - task 028d2410-947f-ed67-a560-0000000003d3 44109 1727204234.80985: variable 'ansible_search_path' from source: unknown 44109 1727204234.80988: variable 'ansible_search_path' from source: unknown 44109 1727204234.81100: calling self._execute() 44109 1727204234.81179: variable 'ansible_host' from source: host vars for 'managed-node1' 44109 1727204234.81183: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44109 1727204234.81194: variable 'omit' from source: magic vars 44109 1727204234.81968: variable 'ansible_distribution_major_version' from source: facts 44109 1727204234.82001: Evaluated conditional (ansible_distribution_major_version != '6'): True 44109 1727204234.82342: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 44109 1727204234.82911: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 44109 1727204234.83185: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 44109 1727204234.83189: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 44109 1727204234.83192: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 44109 1727204234.83380: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 44109 1727204234.83411: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 44109 1727204234.83532: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 44109 1727204234.83566: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 44109 1727204234.83824: variable '__network_is_ostree' from source: set_fact 44109 1727204234.83837: Evaluated conditional (not __network_is_ostree is defined): False 44109 1727204234.83845: when evaluation is False, skipping this task 44109 1727204234.84000: _execute() done 44109 1727204234.84003: dumping result to json 44109 1727204234.84006: done dumping result, returning 44109 1727204234.84009: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree [028d2410-947f-ed67-a560-0000000003d3] 44109 1727204234.84012: sending task result for task 028d2410-947f-ed67-a560-0000000003d3 44109 1727204234.84146: done sending task result for task 028d2410-947f-ed67-a560-0000000003d3 44109 1727204234.84149: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 44109 1727204234.84228: no more pending results, returning what we have 44109 1727204234.84232: results queue empty 44109 1727204234.84232: checking for any_errors_fatal 44109 1727204234.84238: done checking for any_errors_fatal 44109 1727204234.84239: checking for max_fail_percentage 44109 1727204234.84241: done checking for max_fail_percentage 44109 1727204234.84241: checking to see if all hosts have failed and the running result is not ok 44109 1727204234.84242: done checking to see if all hosts have failed 44109 1727204234.84243: getting the remaining hosts for this loop 44109 1727204234.84244: done getting the remaining hosts for this loop 
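The skipped task recorded above is a `set_fact` guarded by the conditional `not __network_is_ostree is defined` (shown verbatim in the skip result). A hedged sketch of what such a task in `set_facts.yml` likely looks like, with the task name, fact name, and conditional taken from the log; the `__ostree_booted_stat` source variable is an assumption, not confirmed by this log:

```yaml
# Sketch only -- reconstructed from the log output, not the role source.
- name: Set flag to indicate system is ostree
  set_fact:
    # __ostree_booted_stat is assumed to come from an earlier stat task
    __network_is_ostree: "{{ __ostree_booted_stat.stat.exists }}"
  when: not __network_is_ostree is defined
```

In this run the fact was already set by an earlier `set_fact`, so the conditional evaluated False and the task was skipped, exactly as the `skipping: [managed-node1]` result shows.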
44109 1727204234.84248: getting the next task for host managed-node1 44109 1727204234.84257: done getting next task for host managed-node1 44109 1727204234.84260: ^ task is: TASK: fedora.linux_system_roles.network : Check which services are running 44109 1727204234.84264: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44109 1727204234.84278: getting variables 44109 1727204234.84280: in VariableManager get_vars() 44109 1727204234.84319: Calling all_inventory to load vars for managed-node1 44109 1727204234.84322: Calling groups_inventory to load vars for managed-node1 44109 1727204234.84324: Calling all_plugins_inventory to load vars for managed-node1 44109 1727204234.84334: Calling all_plugins_play to load vars for managed-node1 44109 1727204234.84337: Calling groups_plugins_inventory to load vars for managed-node1 44109 1727204234.84339: Calling groups_plugins_play to load vars for managed-node1 44109 1727204234.84635: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44109 1727204234.85058: done with get_vars() 44109 1727204234.85070: done getting variables TASK [fedora.linux_system_roles.network : Check which services are running] **** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 Tuesday 24 September 2024 14:57:14 -0400 (0:00:00.058) 0:00:11.649 ***** 44109 1727204234.85298: entering _queue_task() for managed-node1/service_facts 44109 1727204234.85300: Creating lock for service_facts 44109 1727204234.85925: worker is 1 (out of 1 available) 44109 1727204234.85936: exiting _queue_task() for managed-node1/service_facts 44109 1727204234.85946: done queuing things up, now waiting for results queue to drain 44109 1727204234.85947: waiting for pending results... 
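The task queued here invokes the `service_facts` module, whose result populates `ansible_facts.services` with the per-service JSON seen later in this log. A minimal sketch of such a task and of reading its result; the task name matches the log, while the follow-up `debug` task is purely illustrative:

```yaml
# Sketch only -- a minimal service_facts invocation as implied by the log.
- name: Check which services are running
  service_facts:

# Illustrative (not in the log): results land in ansible_facts.services,
# keyed by unit name, e.g. the NetworkManager.service entry seen below.
- name: Show NetworkManager state
  debug:
    msg: "{{ ansible_facts.services['NetworkManager.service'].state }}"
```

Because `service_facts` is a regular module (not an action plugin), the log that follows shows the full remote execution path: tmp-dir creation, AnsiballZ payload build, SFTP transfer of `AnsiballZ_service_facts.py`, `chmod`, and execution with the remote Python.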
44109 1727204234.86701: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Check which services are running 44109 1727204234.86787: in run() - task 028d2410-947f-ed67-a560-0000000003d5 44109 1727204234.86811: variable 'ansible_search_path' from source: unknown 44109 1727204234.86820: variable 'ansible_search_path' from source: unknown 44109 1727204234.86860: calling self._execute() 44109 1727204234.87121: variable 'ansible_host' from source: host vars for 'managed-node1' 44109 1727204234.87125: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44109 1727204234.87127: variable 'omit' from source: magic vars 44109 1727204234.87754: variable 'ansible_distribution_major_version' from source: facts 44109 1727204234.87822: Evaluated conditional (ansible_distribution_major_version != '6'): True 44109 1727204234.87833: variable 'omit' from source: magic vars 44109 1727204234.88028: variable 'omit' from source: magic vars 44109 1727204234.88068: variable 'omit' from source: magic vars 44109 1727204234.88113: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 44109 1727204234.88354: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 44109 1727204234.88357: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 44109 1727204234.88508: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44109 1727204234.88519: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44109 1727204234.88549: variable 'inventory_hostname' from source: host vars for 'managed-node1' 44109 1727204234.88552: variable 'ansible_host' from source: host vars for 'managed-node1' 44109 1727204234.88554: variable 'ansible_ssh_extra_args' from source: host 
vars for 'managed-node1' 44109 1727204234.88768: Set connection var ansible_connection to ssh 44109 1727204234.88797: Set connection var ansible_timeout to 10 44109 1727204234.88806: Set connection var ansible_module_compression to ZIP_DEFLATED 44109 1727204234.88892: Set connection var ansible_pipelining to False 44109 1727204234.89008: Set connection var ansible_shell_executable to /bin/sh 44109 1727204234.89011: Set connection var ansible_shell_type to sh 44109 1727204234.89013: variable 'ansible_shell_executable' from source: unknown 44109 1727204234.89015: variable 'ansible_connection' from source: unknown 44109 1727204234.89018: variable 'ansible_module_compression' from source: unknown 44109 1727204234.89019: variable 'ansible_shell_type' from source: unknown 44109 1727204234.89022: variable 'ansible_shell_executable' from source: unknown 44109 1727204234.89023: variable 'ansible_host' from source: host vars for 'managed-node1' 44109 1727204234.89025: variable 'ansible_pipelining' from source: unknown 44109 1727204234.89027: variable 'ansible_timeout' from source: unknown 44109 1727204234.89029: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44109 1727204234.89380: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 44109 1727204234.89385: variable 'omit' from source: magic vars 44109 1727204234.89387: starting attempt loop 44109 1727204234.89389: running the handler 44109 1727204234.89391: _low_level_execute_command(): starting 44109 1727204234.89660: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 44109 1727204234.91029: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44109 1727204234.91124: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 44109 1727204234.91152: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44109 1727204234.91193: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44109 1727204234.91354: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44109 1727204234.93185: stdout chunk (state=3): >>>/root <<< 44109 1727204234.93300: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44109 1727204234.93327: stdout chunk (state=3): >>><<< 44109 1727204234.93396: stderr chunk (state=3): >>><<< 44109 1727204234.93421: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final 
Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44109 1727204234.93455: _low_level_execute_command(): starting 44109 1727204234.93510: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204234.934404-45333-3305902098151 `" && echo ansible-tmp-1727204234.934404-45333-3305902098151="` echo /root/.ansible/tmp/ansible-tmp-1727204234.934404-45333-3305902098151 `" ) && sleep 0' 44109 1727204234.94312: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 44109 1727204234.94325: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44109 1727204234.94417: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK <<< 44109 1727204234.94516: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44109 1727204234.94597: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44109 1727204234.96652: stdout chunk (state=3): >>>ansible-tmp-1727204234.934404-45333-3305902098151=/root/.ansible/tmp/ansible-tmp-1727204234.934404-45333-3305902098151 <<< 44109 1727204234.97214: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44109 1727204234.97217: stdout chunk (state=3): >>><<< 44109 1727204234.97219: stderr chunk (state=3): >>><<< 44109 1727204234.97221: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204234.934404-45333-3305902098151=/root/.ansible/tmp/ansible-tmp-1727204234.934404-45333-3305902098151 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44109 1727204234.97223: variable 'ansible_module_compression' from source: unknown 44109 1727204234.97224: ANSIBALLZ: Using lock for service_facts 44109 1727204234.97226: ANSIBALLZ: Acquiring lock 44109 1727204234.97228: ANSIBALLZ: Lock acquired: 139907463578544 44109 1727204234.97229: ANSIBALLZ: Creating module 44109 1727204235.15984: ANSIBALLZ: Writing module into payload 44109 1727204235.15988: ANSIBALLZ: Writing module 44109 1727204235.15990: ANSIBALLZ: Renaming module 44109 1727204235.15992: ANSIBALLZ: Done creating module 44109 1727204235.15994: variable 'ansible_facts' from source: unknown 44109 1727204235.16018: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204234.934404-45333-3305902098151/AnsiballZ_service_facts.py 44109 1727204235.16158: Sending initial data 44109 1727204235.16161: Sent initial data (159 bytes) 44109 1727204235.16891: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44109 1727204235.16931: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 44109 1727204235.16942: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44109 1727204235.17230: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44109 1727204235.17241: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44109 1727204235.19114: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 44109 1727204235.19261: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 44109 1727204235.19340: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-44109pzfqangk/tmpcy_b1qnq /root/.ansible/tmp/ansible-tmp-1727204234.934404-45333-3305902098151/AnsiballZ_service_facts.py <<< 44109 1727204235.19346: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204234.934404-45333-3305902098151/AnsiballZ_service_facts.py" <<< 44109 1727204235.19681: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-44109pzfqangk/tmpcy_b1qnq" to remote "/root/.ansible/tmp/ansible-tmp-1727204234.934404-45333-3305902098151/AnsiballZ_service_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204234.934404-45333-3305902098151/AnsiballZ_service_facts.py" <<< 44109 1727204235.21070: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44109 1727204235.21074: stdout chunk (state=3): >>><<< 44109 1727204235.21084: stderr chunk (state=3): >>><<< 44109 1727204235.21267: done transferring module to remote 44109 1727204235.21279: _low_level_execute_command(): starting 44109 1727204235.21288: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204234.934404-45333-3305902098151/ /root/.ansible/tmp/ansible-tmp-1727204234.934404-45333-3305902098151/AnsiballZ_service_facts.py && sleep 0' 44109 1727204235.22446: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 44109 1727204235.22450: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found <<< 44109 1727204235.22452: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config 
debug1: configuration requests final Match pass <<< 44109 1727204235.22455: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44109 1727204235.22482: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44109 1727204235.22486: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 44109 1727204235.22532: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44109 1727204235.22538: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44109 1727204235.22645: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44109 1727204235.24641: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44109 1727204235.24652: stdout chunk (state=3): >>><<< 44109 1727204235.24663: stderr chunk (state=3): >>><<< 44109 1727204235.24685: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44109 1727204235.24693: _low_level_execute_command(): starting 44109 1727204235.24701: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204234.934404-45333-3305902098151/AnsiballZ_service_facts.py && sleep 0' 44109 1727204235.25738: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 44109 1727204235.25741: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44109 1727204235.25752: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44109 1727204235.25755: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44109 1727204235.25757: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44109 1727204235.25810: stderr chunk 
(state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 44109 1727204235.25813: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44109 1727204235.25829: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44109 1727204235.25947: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44109 1727204237.02988: stdout chunk (state=3): >>> {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": 
"systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": 
{"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": 
"rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": 
"sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-clear.service": {"name": "systemd-hibernate-clear.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": 
"systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-quotacheck-root.service": {"name": "systemd-quotacheck-root.service", 
"state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": 
"systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-load-credentials.service": {"name": "systemd-udev-load-credentials.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": 
{"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "capsule@.service": {"name": "capsule@.service", "state": "unknown", "status": "static", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": 
"dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fips-crypto-policy-overlay.service": {"name": "fips-crypto-policy-overlay.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "lvm-devices-import.service": {"name": "lvm-devices-import.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "inactive", "status": "disabled", "source": 
"systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon-root.service": {"name": "quotaon-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon@.service": {"name": "quotaon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": 
"systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-bootctl@.service": {"name": "systemd-bootctl@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-creds@.service": {"name": "systemd-creds@.service", 
"state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald-sync@.service": {"name": "systemd-journald-sync@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": 
"systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock@.service": {"name": "systemd-pcrlock@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck@.service": {"name": "systemd-quotacheck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": 
"indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} <<< 44109 1727204237.04789: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.47 closed. <<< 44109 1727204237.04822: stderr chunk (state=3): >>><<< 44109 1727204237.04825: stdout chunk (state=3): >>><<< 44109 1727204237.04851: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": 
"systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": 
{"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", 
"status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": 
"plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, 
"sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-clear.service": {"name": "systemd-hibernate-clear.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", 
"state": "stopped", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-quotacheck-root.service": {"name": "systemd-quotacheck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-load-credentials.service": {"name": "systemd-udev-load-credentials.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", 
"status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "capsule@.service": {"name": "capsule@.service", "state": "unknown", "status": "static", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", 
"status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fips-crypto-policy-overlay.service": {"name": "fips-crypto-policy-overlay.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "lvm-devices-import.service": {"name": "lvm-devices-import.service", "state": "inactive", "status": "disabled", "source": 
"systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon-root.service": {"name": "quotaon-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon@.service": {"name": "quotaon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, 
"selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": 
"disabled", "source": "systemd"}, "systemd-bootctl@.service": {"name": "systemd-bootctl@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-creds@.service": {"name": "systemd-creds@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald-sync@.service": {"name": "systemd-journald-sync@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": 
"systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock@.service": {"name": "systemd-pcrlock@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck@.service": {"name": "systemd-quotacheck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, 
"systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.47 closed. 44109 1727204237.05226: done with _execute_module (service_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'service_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204234.934404-45333-3305902098151/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 44109 1727204237.05234: _low_level_execute_command(): starting 44109 1727204237.05239: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204234.934404-45333-3305902098151/ > /dev/null 2>&1 && sleep 0' 44109 1727204237.05670: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 44109 1727204237.05679: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44109 1727204237.05707: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44109 1727204237.05710: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 
is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44109 1727204237.05714: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44109 1727204237.05765: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 44109 1727204237.05769: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44109 1727204237.05774: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44109 1727204237.05853: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44109 1727204237.07851: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44109 1727204237.07883: stderr chunk (state=3): >>><<< 44109 1727204237.07887: stdout chunk (state=3): >>><<< 44109 1727204237.07899: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 
debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44109 1727204237.07905: handler run complete 44109 1727204237.08019: variable 'ansible_facts' from source: unknown 44109 1727204237.08116: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44109 1727204237.08370: variable 'ansible_facts' from source: unknown 44109 1727204237.08456: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44109 1727204237.08569: attempt loop complete, returning result 44109 1727204237.08573: _execute() done 44109 1727204237.08577: dumping result to json 44109 1727204237.08617: done dumping result, returning 44109 1727204237.08623: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Check which services are running [028d2410-947f-ed67-a560-0000000003d5] 44109 1727204237.08628: sending task result for task 028d2410-947f-ed67-a560-0000000003d5 44109 1727204237.09347: done sending task result for task 028d2410-947f-ed67-a560-0000000003d5 44109 1727204237.09350: WORKER PROCESS EXITING ok: [managed-node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 44109 1727204237.09389: no more pending results, returning what we have 44109 1727204237.09391: results queue empty 44109 1727204237.09392: checking for any_errors_fatal 44109 1727204237.09394: done checking for any_errors_fatal 44109 1727204237.09394: checking for max_fail_percentage 44109 1727204237.09395: done checking for max_fail_percentage 44109 1727204237.09396: checking to see if all hosts have failed and 
the running result is not ok 44109 1727204237.09396: done checking to see if all hosts have failed 44109 1727204237.09397: getting the remaining hosts for this loop 44109 1727204237.09397: done getting the remaining hosts for this loop 44109 1727204237.09400: getting the next task for host managed-node1 44109 1727204237.09405: done getting next task for host managed-node1 44109 1727204237.09408: ^ task is: TASK: fedora.linux_system_roles.network : Check which packages are installed 44109 1727204237.09411: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44109 1727204237.09418: getting variables 44109 1727204237.09419: in VariableManager get_vars() 44109 1727204237.09439: Calling all_inventory to load vars for managed-node1 44109 1727204237.09441: Calling groups_inventory to load vars for managed-node1 44109 1727204237.09442: Calling all_plugins_inventory to load vars for managed-node1 44109 1727204237.09448: Calling all_plugins_play to load vars for managed-node1 44109 1727204237.09449: Calling groups_plugins_inventory to load vars for managed-node1 44109 1727204237.09451: Calling groups_plugins_play to load vars for managed-node1 44109 1727204237.09668: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44109 1727204237.09944: done with get_vars() 44109 1727204237.09955: done getting variables TASK [fedora.linux_system_roles.network : Check which packages are installed] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 Tuesday 24 September 2024 14:57:17 -0400 (0:00:02.247) 0:00:13.896 ***** 44109 1727204237.10026: entering _queue_task() for managed-node1/package_facts 44109 1727204237.10027: Creating lock for package_facts 44109 1727204237.10251: worker is 1 (out of 1 available) 44109 1727204237.10264: exiting _queue_task() for managed-node1/package_facts 44109 1727204237.10276: done queuing things up, now waiting for results queue to drain 44109 1727204237.10277: waiting for pending results... 
44109 1727204237.10451: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Check which packages are installed 44109 1727204237.10552: in run() - task 028d2410-947f-ed67-a560-0000000003d6 44109 1727204237.10563: variable 'ansible_search_path' from source: unknown 44109 1727204237.10566: variable 'ansible_search_path' from source: unknown 44109 1727204237.10595: calling self._execute() 44109 1727204237.10662: variable 'ansible_host' from source: host vars for 'managed-node1' 44109 1727204237.10666: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44109 1727204237.10674: variable 'omit' from source: magic vars 44109 1727204237.10946: variable 'ansible_distribution_major_version' from source: facts 44109 1727204237.10956: Evaluated conditional (ansible_distribution_major_version != '6'): True 44109 1727204237.10962: variable 'omit' from source: magic vars 44109 1727204237.11006: variable 'omit' from source: magic vars 44109 1727204237.11033: variable 'omit' from source: magic vars 44109 1727204237.11066: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 44109 1727204237.11094: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 44109 1727204237.11110: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 44109 1727204237.11127: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44109 1727204237.11136: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44109 1727204237.11162: variable 'inventory_hostname' from source: host vars for 'managed-node1' 44109 1727204237.11166: variable 'ansible_host' from source: host vars for 'managed-node1' 44109 1727204237.11168: variable 'ansible_ssh_extra_args' from source: host 
vars for 'managed-node1' 44109 1727204237.11239: Set connection var ansible_connection to ssh 44109 1727204237.11242: Set connection var ansible_timeout to 10 44109 1727204237.11248: Set connection var ansible_module_compression to ZIP_DEFLATED 44109 1727204237.11254: Set connection var ansible_pipelining to False 44109 1727204237.11261: Set connection var ansible_shell_executable to /bin/sh 44109 1727204237.11263: Set connection var ansible_shell_type to sh 44109 1727204237.11284: variable 'ansible_shell_executable' from source: unknown 44109 1727204237.11288: variable 'ansible_connection' from source: unknown 44109 1727204237.11291: variable 'ansible_module_compression' from source: unknown 44109 1727204237.11294: variable 'ansible_shell_type' from source: unknown 44109 1727204237.11296: variable 'ansible_shell_executable' from source: unknown 44109 1727204237.11299: variable 'ansible_host' from source: host vars for 'managed-node1' 44109 1727204237.11302: variable 'ansible_pipelining' from source: unknown 44109 1727204237.11305: variable 'ansible_timeout' from source: unknown 44109 1727204237.11307: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44109 1727204237.11448: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 44109 1727204237.11457: variable 'omit' from source: magic vars 44109 1727204237.11462: starting attempt loop 44109 1727204237.11465: running the handler 44109 1727204237.11478: _low_level_execute_command(): starting 44109 1727204237.11490: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 44109 1727204237.12005: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44109 1727204237.12009: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44109 1727204237.12012: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44109 1727204237.12071: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 44109 1727204237.12082: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44109 1727204237.12084: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44109 1727204237.12154: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44109 1727204237.13945: stdout chunk (state=3): >>>/root <<< 44109 1727204237.14045: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44109 1727204237.14073: stderr chunk (state=3): >>><<< 44109 1727204237.14078: stdout chunk (state=3): >>><<< 44109 1727204237.14099: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44109 1727204237.14112: _low_level_execute_command(): starting 44109 1727204237.14119: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204237.1409762-45604-21606344571590 `" && echo ansible-tmp-1727204237.1409762-45604-21606344571590="` echo /root/.ansible/tmp/ansible-tmp-1727204237.1409762-45604-21606344571590 `" ) && sleep 0' 44109 1727204237.14550: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44109 1727204237.14553: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44109 1727204237.14556: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading 
configuration data /root/.ssh/config <<< 44109 1727204237.14566: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44109 1727204237.14620: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 44109 1727204237.14623: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44109 1727204237.14625: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44109 1727204237.14706: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44109 1727204237.16786: stdout chunk (state=3): >>>ansible-tmp-1727204237.1409762-45604-21606344571590=/root/.ansible/tmp/ansible-tmp-1727204237.1409762-45604-21606344571590 <<< 44109 1727204237.16893: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44109 1727204237.16922: stderr chunk (state=3): >>><<< 44109 1727204237.16925: stdout chunk (state=3): >>><<< 44109 1727204237.16942: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204237.1409762-45604-21606344571590=/root/.ansible/tmp/ansible-tmp-1727204237.1409762-45604-21606344571590 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44109 1727204237.16982: variable 'ansible_module_compression' from source: unknown 44109 1727204237.17024: ANSIBALLZ: Using lock for package_facts 44109 1727204237.17027: ANSIBALLZ: Acquiring lock 44109 1727204237.17030: ANSIBALLZ: Lock acquired: 139907462678144 44109 1727204237.17033: ANSIBALLZ: Creating module 44109 1727204237.35193: ANSIBALLZ: Writing module into payload 44109 1727204237.35283: ANSIBALLZ: Writing module 44109 1727204237.35304: ANSIBALLZ: Renaming module 44109 1727204237.35309: ANSIBALLZ: Done creating module 44109 1727204237.35339: variable 'ansible_facts' from source: unknown 44109 1727204237.35455: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204237.1409762-45604-21606344571590/AnsiballZ_package_facts.py 44109 1727204237.35560: Sending initial data 44109 1727204237.35564: Sent initial data (161 bytes) 44109 1727204237.36021: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44109 1727204237.36025: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found <<< 44109 1727204237.36027: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 44109 1727204237.36029: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44109 1727204237.36046: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44109 1727204237.36081: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 44109 1727204237.36094: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 44109 1727204237.36188: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44109 1727204237.37928: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 44109 1727204237.38002: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 44109 1727204237.38083: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-44109pzfqangk/tmpipz5tq_4 /root/.ansible/tmp/ansible-tmp-1727204237.1409762-45604-21606344571590/AnsiballZ_package_facts.py <<< 44109 1727204237.38087: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204237.1409762-45604-21606344571590/AnsiballZ_package_facts.py" <<< 44109 1727204237.38151: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-44109pzfqangk/tmpipz5tq_4" to remote "/root/.ansible/tmp/ansible-tmp-1727204237.1409762-45604-21606344571590/AnsiballZ_package_facts.py" <<< 44109 1727204237.38154: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204237.1409762-45604-21606344571590/AnsiballZ_package_facts.py" <<< 44109 1727204237.39358: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44109 1727204237.39403: stderr chunk (state=3): >>><<< 44109 1727204237.39406: stdout chunk (state=3): >>><<< 44109 1727204237.39443: done transferring module to remote 44109 1727204237.39451: _low_level_execute_command(): starting 44109 1727204237.39456: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204237.1409762-45604-21606344571590/ /root/.ansible/tmp/ansible-tmp-1727204237.1409762-45604-21606344571590/AnsiballZ_package_facts.py && sleep 0' 44109 1727204237.39909: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 44109 1727204237.39913: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found <<< 44109 1727204237.39915: stderr chunk (state=3): >>>debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44109 1727204237.39917: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44109 1727204237.39925: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44109 1727204237.39977: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 44109 1727204237.39981: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44109 1727204237.39986: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44109 1727204237.40065: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44109 1727204237.42030: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44109 1727204237.42053: stderr chunk (state=3): >>><<< 44109 1727204237.42056: stdout chunk (state=3): >>><<< 44109 1727204237.42068: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44109 1727204237.42071: _low_level_execute_command(): starting 44109 1727204237.42081: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204237.1409762-45604-21606344571590/AnsiballZ_package_facts.py && sleep 0' 44109 1727204237.42520: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 44109 1727204237.42524: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found <<< 44109 1727204237.42526: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44109 1727204237.42528: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44109 1727204237.42530: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 44109 1727204237.42573: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 44109 1727204237.42579: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 44109 1727204237.42673: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44109 1727204237.89826: stdout chunk (state=3): >>> {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-mono-vf-fonts": [{"name": "redhat-mono-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": 
"noarch", "source": "rpm"}], "redhat-text-vf-fonts": [{"name": "redhat-text-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": 
[{"name": "intel-gpu-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks"<<< 44109 1727204237.89856: stdout chunk (state=3): >>>: [{"name": "gawk-all-langpacks", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.379", "release": "10.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": 
"rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "21.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.6", "release": 
"3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.5", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.30", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.50", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.46.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "51.20230828cvs.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "9.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "2.el10", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.2", "rele<<< 44109 1727204237.89873: stdout chunk (state=3): >>>ase": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.10.0", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.5.0", "release": "1.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": 
"noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.0", "release": "3.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.4", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": 
"polkit-libs", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "101.2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.11.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.7", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": 
"6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.3", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "6.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.6.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "42.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.6.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.29", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}],
"gobject-introspection": [{"name": "gobject-introspection", "version": "1.79.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsecret": [{"name": "libsecret", "version": "0.21.2", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pinentry": [{"name": "pinentry", "version": "1.3.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": 
"4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.7.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.7", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.10", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.8", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "35", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "6.11.0", "release": "25.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", 
"version": "661", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "memstrack": [{"name": "memstrack", "version": "0.2.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.25.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "22.el10.gating_test1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse3-libs": [{"name": "fuse3-libs", "version": "3.16.2", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "28.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.62.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs":
[{"name": "krb5-libs", "version": "1.21.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-gssapi": [{"name": "cyrus-sasl-gssapi", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "256", 
"release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11^20190603git9e74f2d", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "29.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.8p1", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240905", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}],
"libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.22.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "makedumpfile": [{"name": "makedumpfile", "version": "1.7.5", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "76.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.11.0", "release": "25.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.11.0", "release": "25.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.11.0", "release": "25.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.32", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "5.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-krb5-common": [{"name": "sssd-krb5-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "15.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "14.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.29", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kdump-utils": [{"name": "kdump-utils", "version": "1.0.43", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.13.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.6.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite":
[{"name": "pcsc-lite", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.3", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.24", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "1.2304.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"langpacks-en": [{"name": "langpacks-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2408.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "6.11.0", "release": "25.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.11.0", "release": "25.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}],
"openssh-clients": [{"name": "openssh-clients", "version": "9.8p1", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.8p1", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "8.el10", "epoch": 18, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.2.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}],
"vim-minimal": [{"name": "vim-minimal", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.20", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwlwifi-dvm-firmware": [{"name": "iwlwifi-dvm-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwlwifi-mvm-firmware": [{"name": "iwlwifi-mvm-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "37.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"perl-B": [{"name": "perl-B", "version": "1.89", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.189", "release": "511.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.27", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.12", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "510.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": 
"2.085", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.20", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "510.el10", "epoch": 1, "arch": "noarch", 
"source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.18", "release": "510.el10", "epoch<<< 44109 1727204237.90009: stdout chunk (state=3): >>>": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.29", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.55", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "510.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.38", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "510.el10", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "511.el10", "epoch": null, "arch": "noarch", "source": 
"rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "510.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.58", "release": "2.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.91", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.56", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.17", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.44", "release": "510.el10", "epoch": 0, 
"arch": "noarch", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch",<<< 44109 1727204237.90025: stdout chunk (state=3): >>> "source": "rpm"}], "git": [{"name": "git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.39", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.2.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "24.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "4.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.10", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": 
"wget", "version": "1.24.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "5.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.el10", "epoch<<< 44109 1727204237.90039: stdout chunk (state=3): >>>": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "5.el10", "epoch": null, "arch": "noarch", "source": 
"rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "18.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.32.3", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.6", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "17.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "libaio": 
[{"name": "libaio", "version": "0.3.111", "release": "20.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.11", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip": [{"name": "python3-pip", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "11.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} <<< 44109 1727204237.92180: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.47 closed. 
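The JSON payload above is the result of Ansible's `package_facts` module: under `ansible_facts.packages`, each key is a package name mapping to a *list* of dicts with `name`, `version`, `release`, `epoch`, `arch`, and `source` fields (a list because multiple versions or architectures can be installed at once, and `epoch` is `null` when unset). A minimal sketch of how such a payload can be inspected, using a small hypothetical subset of the entries seen in this log:

```python
import json

# Trimmed sample mirroring two entries from the package_facts output above
# (hypothetical subset; the real payload lists every installed package).
payload = json.loads("""
{"ansible_facts": {"packages": {
  "git": [{"name": "git", "version": "2.45.2", "release": "3.el10",
           "epoch": null, "arch": "x86_64", "source": "rpm"}],
  "openssl": [{"name": "openssl", "version": "3.2.2", "release": "12.el10",
               "epoch": 1, "arch": "x86_64", "source": "rpm"}]
}}}
""")

packages = payload["ansible_facts"]["packages"]

def full_evr(pkg):
    """Render epoch:version-release the way rpm does; epoch may be null."""
    epoch = pkg.get("epoch")
    prefix = f"{epoch}:" if epoch is not None else ""
    return f"{prefix}{pkg['version']}-{pkg['release']}"

# Each package name maps to a list, so index the first install record.
print(full_evr(packages["git"][0]))      # 2.45.2-3.el10
print(full_evr(packages["openssl"][0]))  # 1:3.2.2-12.el10
```

In a playbook, the same structure is reachable after a `package_facts:` task as `ansible_facts.packages['git'][0].version`, which is how test plays like this one typically assert on installed versions.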
<<< 44109 1727204237.92207: stderr chunk (state=3): >>><<< 44109 1727204237.92211: stdout chunk (state=3): >>><<< 44109 1727204237.92251: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-mono-vf-fonts": [{"name": "redhat-mono-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-text-vf-fonts": [{"name": "redhat-text-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.1", "release": "2.el10", "epoch": null, 
"arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": 
[{"name": "linux-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.379", "release": "10.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "10.0", "release": 
"0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "21.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.6", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": 
"2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.5", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.30", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.50", "release": "1.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.46.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "51.20230828cvs.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "9.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": 
"3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.10.0", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.5.0", "release": "1.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.0", "release": "3.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.4", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": 
[{"name": "coreutils", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "101.2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.11.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.7", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", 
"release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.3", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.6.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "11.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "42.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.6.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.29", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.79.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsecret": [{"name": "libsecret", "version": "0.21.2", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pinentry": [{"name": "pinentry", "version": "1.3.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", 
"release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.7.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": 
"ethtool", "version": "6.7", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.10", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.8", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "35", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "6.11.0", "release": "25.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "661", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "memstrack": [{"name": "memstrack", "version": "0.2.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"c-ares": [{"name": "c-ares", "version": "1.25.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "22.el10.gating_test1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse3-libs": [{"name": "fuse3-libs", "version": "3.16.2", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "28.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": 
"iptables-libs", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.62.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-gssapi": [{"name": "cyrus-sasl-gssapi", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": 
[{"name": "authselect-libs", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": 
"1.11^20190603git9e74f2d", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "29.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.8p1", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240905", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.22.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "makedumpfile": [{"name": "makedumpfile", "version": "1.7.5", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": 
"rpm-build-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "76.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", 
"version": "6.11.0", "release": "25.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.11.0", "release": "25.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.11.0", "release": "25.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.32", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-krb5-common": [{"name": "sssd-krb5-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "noarch", "source": 
"rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "15.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "14.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": 
[{"name": "python3-systemd", "version": "235", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.29", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kdump-utils": [{"name": "kdump-utils", "version": "1.0.43", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.13.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.6.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"python3-dnf": [{"name": "python3-dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.3", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.24", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "1.2304.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2408.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "6.11.0", "release": "25.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.11.0", "release": "25.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.8p1", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.8p1", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": 
"8.el10", "epoch": 18, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.2.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.20", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwlwifi-dvm-firmware": [{"name": "iwlwifi-dvm-firmware", "version": "20240910", 
"release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwlwifi-mvm-firmware": [{"name": "iwlwifi-mvm-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "37.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.89", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.189", "release": "511.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.27", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": 
"5.74", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.12", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "510.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.20", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "510.el10", 
"epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.18", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.29", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", 
"version": "1.55", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "510.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.38", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "510.el10", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "510.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.58", "release": "2.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "510.el10", "epoch": null, "arch": "noarch", 
"source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.91", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.56", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.17", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.44", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.39", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"python3-lxml": [{"name": "python3-lxml", "version": "5.2.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "24.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "4.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.10", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.24.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": 
"rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": 
"python3-jsonschema", "version": "4.19.1", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": 
"python3-policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "18.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.32.3", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.6", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "17.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "20.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.11", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", 
"version": "3.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip": [{"name": "python3-pip", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "11.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared 
connection to 10.31.14.47 closed. 44109 1727204237.94485: done with _execute_module (package_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'package_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204237.1409762-45604-21606344571590/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 44109 1727204237.94489: _low_level_execute_command(): starting 44109 1727204237.94492: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204237.1409762-45604-21606344571590/ > /dev/null 2>&1 && sleep 0' 44109 1727204237.95056: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44109 1727204237.95071: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44109 1727204237.95093: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44109 1727204237.95114: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44109 1727204237.95131: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 <<< 44109 1727204237.95142: stderr chunk (state=3): >>>debug2: match not found <<< 44109 1727204237.95156: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44109 1727204237.95173: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 44109 1727204237.95189: stderr chunk (state=3): >>>debug2: resolve_canonicalize: 
hostname 10.31.14.47 is address <<< 44109 1727204237.95199: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 44109 1727204237.95279: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK <<< 44109 1727204237.95303: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44109 1727204237.95417: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44109 1727204237.97431: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44109 1727204237.97498: stderr chunk (state=3): >>><<< 44109 1727204237.97510: stdout chunk (state=3): >>><<< 44109 1727204237.97532: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44109 1727204237.97549: handler run complete 44109 1727204237.98409: variable 'ansible_facts' from source: unknown 44109 1727204237.98856: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44109 1727204238.04907: variable 'ansible_facts' from source: unknown 44109 1727204238.05448: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44109 1727204238.06079: attempt loop complete, returning result 44109 1727204238.06100: _execute() done 44109 1727204238.06116: dumping result to json 44109 1727204238.06342: done dumping result, returning 44109 1727204238.06355: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Check which packages are installed [028d2410-947f-ed67-a560-0000000003d6] 44109 1727204238.06364: sending task result for task 028d2410-947f-ed67-a560-0000000003d6 44109 1727204238.08659: done sending task result for task 028d2410-947f-ed67-a560-0000000003d6 44109 1727204238.08662: WORKER PROCESS EXITING ok: [managed-node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 44109 1727204238.08762: no more pending results, returning what we have 44109 1727204238.08765: results queue empty 44109 1727204238.08766: checking for any_errors_fatal 44109 1727204238.08770: done checking for any_errors_fatal 44109 1727204238.08771: checking for max_fail_percentage 44109 1727204238.08772: done checking for max_fail_percentage 44109 1727204238.08773: checking to see if all hosts have failed and the running result is not ok 44109 
1727204238.08774: done checking to see if all hosts have failed 44109 1727204238.08777: getting the remaining hosts for this loop 44109 1727204238.08778: done getting the remaining hosts for this loop 44109 1727204238.08782: getting the next task for host managed-node1 44109 1727204238.08788: done getting next task for host managed-node1 44109 1727204238.08792: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider 44109 1727204238.08796: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44109 1727204238.08806: getting variables 44109 1727204238.08807: in VariableManager get_vars() 44109 1727204238.08838: Calling all_inventory to load vars for managed-node1 44109 1727204238.08841: Calling groups_inventory to load vars for managed-node1 44109 1727204238.08844: Calling all_plugins_inventory to load vars for managed-node1 44109 1727204238.08852: Calling all_plugins_play to load vars for managed-node1 44109 1727204238.08855: Calling groups_plugins_inventory to load vars for managed-node1 44109 1727204238.08857: Calling groups_plugins_play to load vars for managed-node1 44109 1727204238.09658: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44109 1727204238.10534: done with get_vars() 44109 1727204238.10551: done getting variables 44109 1727204238.10614: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print network provider] ************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Tuesday 24 September 2024 14:57:18 -0400 (0:00:01.006) 0:00:14.902 ***** 44109 1727204238.10650: entering _queue_task() for managed-node1/debug 44109 1727204238.10931: worker is 1 (out of 1 available) 44109 1727204238.10946: exiting _queue_task() for managed-node1/debug 44109 1727204238.10957: done queuing things up, now waiting for results queue to drain 44109 1727204238.10958: waiting for pending results... 
44109 1727204238.11142: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Print network provider 44109 1727204238.11225: in run() - task 028d2410-947f-ed67-a560-000000000018 44109 1727204238.11239: variable 'ansible_search_path' from source: unknown 44109 1727204238.11243: variable 'ansible_search_path' from source: unknown 44109 1727204238.11270: calling self._execute() 44109 1727204238.11344: variable 'ansible_host' from source: host vars for 'managed-node1' 44109 1727204238.11348: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44109 1727204238.11356: variable 'omit' from source: magic vars 44109 1727204238.11881: variable 'ansible_distribution_major_version' from source: facts 44109 1727204238.11885: Evaluated conditional (ansible_distribution_major_version != '6'): True 44109 1727204238.11888: variable 'omit' from source: magic vars 44109 1727204238.11890: variable 'omit' from source: magic vars 44109 1727204238.11892: variable 'network_provider' from source: set_fact 44109 1727204238.11894: variable 'omit' from source: magic vars 44109 1727204238.11924: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 44109 1727204238.11963: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 44109 1727204238.11989: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 44109 1727204238.12015: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44109 1727204238.12033: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44109 1727204238.12069: variable 'inventory_hostname' from source: host vars for 'managed-node1' 44109 1727204238.12074: variable 'ansible_host' from source: host vars for 'managed-node1' 44109 
1727204238.12080: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44109 1727204238.12151: Set connection var ansible_connection to ssh 44109 1727204238.12155: Set connection var ansible_timeout to 10 44109 1727204238.12161: Set connection var ansible_module_compression to ZIP_DEFLATED 44109 1727204238.12168: Set connection var ansible_pipelining to False 44109 1727204238.12173: Set connection var ansible_shell_executable to /bin/sh 44109 1727204238.12178: Set connection var ansible_shell_type to sh 44109 1727204238.12195: variable 'ansible_shell_executable' from source: unknown 44109 1727204238.12198: variable 'ansible_connection' from source: unknown 44109 1727204238.12201: variable 'ansible_module_compression' from source: unknown 44109 1727204238.12204: variable 'ansible_shell_type' from source: unknown 44109 1727204238.12206: variable 'ansible_shell_executable' from source: unknown 44109 1727204238.12208: variable 'ansible_host' from source: host vars for 'managed-node1' 44109 1727204238.12211: variable 'ansible_pipelining' from source: unknown 44109 1727204238.12217: variable 'ansible_timeout' from source: unknown 44109 1727204238.12220: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44109 1727204238.12324: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 44109 1727204238.12337: variable 'omit' from source: magic vars 44109 1727204238.12340: starting attempt loop 44109 1727204238.12343: running the handler 44109 1727204238.12373: handler run complete 44109 1727204238.12385: attempt loop complete, returning result 44109 1727204238.12388: _execute() done 44109 1727204238.12391: dumping result to json 44109 1727204238.12393: done dumping result, returning 
44109 1727204238.12400: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Print network provider [028d2410-947f-ed67-a560-000000000018] 44109 1727204238.12402: sending task result for task 028d2410-947f-ed67-a560-000000000018 ok: [managed-node1] => {} MSG: Using network provider: nm 44109 1727204238.12534: no more pending results, returning what we have 44109 1727204238.12538: results queue empty 44109 1727204238.12539: checking for any_errors_fatal 44109 1727204238.12553: done checking for any_errors_fatal 44109 1727204238.12554: checking for max_fail_percentage 44109 1727204238.12556: done checking for max_fail_percentage 44109 1727204238.12556: checking to see if all hosts have failed and the running result is not ok 44109 1727204238.12557: done checking to see if all hosts have failed 44109 1727204238.12558: getting the remaining hosts for this loop 44109 1727204238.12559: done getting the remaining hosts for this loop 44109 1727204238.12562: getting the next task for host managed-node1 44109 1727204238.12569: done getting next task for host managed-node1 44109 1727204238.12572: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 44109 1727204238.12575: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44109 1727204238.12586: getting variables 44109 1727204238.12588: in VariableManager get_vars() 44109 1727204238.12621: Calling all_inventory to load vars for managed-node1 44109 1727204238.12624: Calling groups_inventory to load vars for managed-node1 44109 1727204238.12626: Calling all_plugins_inventory to load vars for managed-node1 44109 1727204238.12634: Calling all_plugins_play to load vars for managed-node1 44109 1727204238.12637: Calling groups_plugins_inventory to load vars for managed-node1 44109 1727204238.12639: Calling groups_plugins_play to load vars for managed-node1 44109 1727204238.13392: done sending task result for task 028d2410-947f-ed67-a560-000000000018 44109 1727204238.13397: WORKER PROCESS EXITING 44109 1727204238.13407: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44109 1727204238.14260: done with get_vars() 44109 1727204238.14281: done getting variables 44109 1727204238.14324: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Tuesday 24 September 2024 14:57:18 -0400 (0:00:00.036) 0:00:14.939 ***** 44109 1727204238.14349: entering _queue_task() for managed-node1/fail 44109 1727204238.14588: worker is 1 (out of 1 available) 44109 1727204238.14601: exiting _queue_task() for managed-node1/fail 44109 1727204238.14613: done queuing things up, now waiting for results queue to drain 44109 1727204238.14614: waiting for pending results... 
44109 1727204238.14788: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 44109 1727204238.14877: in run() - task 028d2410-947f-ed67-a560-000000000019 44109 1727204238.14889: variable 'ansible_search_path' from source: unknown 44109 1727204238.14893: variable 'ansible_search_path' from source: unknown 44109 1727204238.14924: calling self._execute() 44109 1727204238.14991: variable 'ansible_host' from source: host vars for 'managed-node1' 44109 1727204238.14996: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44109 1727204238.15003: variable 'omit' from source: magic vars 44109 1727204238.15282: variable 'ansible_distribution_major_version' from source: facts 44109 1727204238.15291: Evaluated conditional (ansible_distribution_major_version != '6'): True 44109 1727204238.15373: variable 'network_state' from source: role '' defaults 44109 1727204238.15386: Evaluated conditional (network_state != {}): False 44109 1727204238.15389: when evaluation is False, skipping this task 44109 1727204238.15392: _execute() done 44109 1727204238.15394: dumping result to json 44109 1727204238.15397: done dumping result, returning 44109 1727204238.15400: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider [028d2410-947f-ed67-a560-000000000019] 44109 1727204238.15407: sending task result for task 028d2410-947f-ed67-a560-000000000019 44109 1727204238.15495: done sending task result for task 028d2410-947f-ed67-a560-000000000019 44109 1727204238.15498: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 44109 1727204238.15544: no more pending results, 
returning what we have 44109 1727204238.15548: results queue empty 44109 1727204238.15549: checking for any_errors_fatal 44109 1727204238.15555: done checking for any_errors_fatal 44109 1727204238.15556: checking for max_fail_percentage 44109 1727204238.15558: done checking for max_fail_percentage 44109 1727204238.15558: checking to see if all hosts have failed and the running result is not ok 44109 1727204238.15559: done checking to see if all hosts have failed 44109 1727204238.15560: getting the remaining hosts for this loop 44109 1727204238.15561: done getting the remaining hosts for this loop 44109 1727204238.15564: getting the next task for host managed-node1 44109 1727204238.15571: done getting next task for host managed-node1 44109 1727204238.15574: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 44109 1727204238.15581: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44109 1727204238.15598: getting variables 44109 1727204238.15599: in VariableManager get_vars() 44109 1727204238.15635: Calling all_inventory to load vars for managed-node1 44109 1727204238.15638: Calling groups_inventory to load vars for managed-node1 44109 1727204238.15640: Calling all_plugins_inventory to load vars for managed-node1 44109 1727204238.15649: Calling all_plugins_play to load vars for managed-node1 44109 1727204238.15652: Calling groups_plugins_inventory to load vars for managed-node1 44109 1727204238.15654: Calling groups_plugins_play to load vars for managed-node1 44109 1727204238.16510: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44109 1727204238.17361: done with get_vars() 44109 1727204238.17380: done getting variables 44109 1727204238.17426: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18 Tuesday 24 September 2024 14:57:18 -0400 (0:00:00.030) 0:00:14.970 ***** 44109 1727204238.17451: entering _queue_task() for managed-node1/fail 44109 1727204238.17688: worker is 1 (out of 1 available) 44109 1727204238.17703: exiting _queue_task() for managed-node1/fail 44109 1727204238.17714: done queuing things up, now waiting for results queue to drain 44109 1727204238.17718: waiting for pending results... 
44109 1727204238.18004: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 44109 1727204238.18057: in run() - task 028d2410-947f-ed67-a560-00000000001a 44109 1727204238.18079: variable 'ansible_search_path' from source: unknown 44109 1727204238.18088: variable 'ansible_search_path' from source: unknown 44109 1727204238.18132: calling self._execute() 44109 1727204238.18225: variable 'ansible_host' from source: host vars for 'managed-node1' 44109 1727204238.18234: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44109 1727204238.18245: variable 'omit' from source: magic vars 44109 1727204238.18604: variable 'ansible_distribution_major_version' from source: facts 44109 1727204238.18621: Evaluated conditional (ansible_distribution_major_version != '6'): True 44109 1727204238.18737: variable 'network_state' from source: role '' defaults 44109 1727204238.18858: Evaluated conditional (network_state != {}): False 44109 1727204238.18861: when evaluation is False, skipping this task 44109 1727204238.18863: _execute() done 44109 1727204238.18865: dumping result to json 44109 1727204238.18868: done dumping result, returning 44109 1727204238.18870: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 [028d2410-947f-ed67-a560-00000000001a] 44109 1727204238.18872: sending task result for task 028d2410-947f-ed67-a560-00000000001a 44109 1727204238.18938: done sending task result for task 028d2410-947f-ed67-a560-00000000001a 44109 1727204238.18941: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 44109 1727204238.19005: no more pending results, returning what we have 44109 
1727204238.19008: results queue empty 44109 1727204238.19009: checking for any_errors_fatal 44109 1727204238.19018: done checking for any_errors_fatal 44109 1727204238.19019: checking for max_fail_percentage 44109 1727204238.19020: done checking for max_fail_percentage 44109 1727204238.19021: checking to see if all hosts have failed and the running result is not ok 44109 1727204238.19022: done checking to see if all hosts have failed 44109 1727204238.19022: getting the remaining hosts for this loop 44109 1727204238.19023: done getting the remaining hosts for this loop 44109 1727204238.19027: getting the next task for host managed-node1 44109 1727204238.19033: done getting next task for host managed-node1 44109 1727204238.19037: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 44109 1727204238.19040: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44109 1727204238.19056: getting variables 44109 1727204238.19058: in VariableManager get_vars() 44109 1727204238.19094: Calling all_inventory to load vars for managed-node1 44109 1727204238.19096: Calling groups_inventory to load vars for managed-node1 44109 1727204238.19099: Calling all_plugins_inventory to load vars for managed-node1 44109 1727204238.19109: Calling all_plugins_play to load vars for managed-node1 44109 1727204238.19111: Calling groups_plugins_inventory to load vars for managed-node1 44109 1727204238.19114: Calling groups_plugins_play to load vars for managed-node1 44109 1727204238.20143: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44109 1727204238.21096: done with get_vars() 44109 1727204238.21115: done getting variables 44109 1727204238.21157: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25 Tuesday 24 September 2024 14:57:18 -0400 (0:00:00.037) 0:00:15.008 ***** 44109 1727204238.21183: entering _queue_task() for managed-node1/fail 44109 1727204238.21417: worker is 1 (out of 1 available) 44109 1727204238.21429: exiting _queue_task() for managed-node1/fail 44109 1727204238.21441: done queuing things up, now waiting for results queue to drain 44109 1727204238.21441: waiting for pending results... 
44109 1727204238.21618: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 44109 1727204238.21702: in run() - task 028d2410-947f-ed67-a560-00000000001b 44109 1727204238.21717: variable 'ansible_search_path' from source: unknown 44109 1727204238.21721: variable 'ansible_search_path' from source: unknown 44109 1727204238.21746: calling self._execute() 44109 1727204238.21816: variable 'ansible_host' from source: host vars for 'managed-node1' 44109 1727204238.21820: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44109 1727204238.21826: variable 'omit' from source: magic vars 44109 1727204238.22086: variable 'ansible_distribution_major_version' from source: facts 44109 1727204238.22097: Evaluated conditional (ansible_distribution_major_version != '6'): True 44109 1727204238.22219: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 44109 1727204238.23681: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 44109 1727204238.23735: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 44109 1727204238.23762: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 44109 1727204238.23788: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 44109 1727204238.23808: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 44109 1727204238.23867: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44109 1727204238.23890: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44109 1727204238.23908: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44109 1727204238.23936: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44109 1727204238.23947: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44109 1727204238.24022: variable 'ansible_distribution_major_version' from source: facts 44109 1727204238.24034: Evaluated conditional (ansible_distribution_major_version | int > 9): True 44109 1727204238.24117: variable 'ansible_distribution' from source: facts 44109 1727204238.24121: variable '__network_rh_distros' from source: role '' defaults 44109 1727204238.24129: Evaluated conditional (ansible_distribution in __network_rh_distros): True 44109 1727204238.24287: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44109 1727204238.24305: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44109 1727204238.24324: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44109 
1727204238.24349: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44109 1727204238.24360: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44109 1727204238.24395: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44109 1727204238.24412: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44109 1727204238.24431: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44109 1727204238.24454: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44109 1727204238.24464: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44109 1727204238.24497: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44109 1727204238.24513: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, 
class_only=False) 44109 1727204238.24531: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44109 1727204238.24555: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44109 1727204238.24565: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44109 1727204238.24756: variable 'network_connections' from source: task vars 44109 1727204238.24765: variable 'interface' from source: set_fact 44109 1727204238.24815: variable 'interface' from source: set_fact 44109 1727204238.24823: variable 'interface' from source: set_fact 44109 1727204238.24864: variable 'interface' from source: set_fact 44109 1727204238.24890: variable 'network_state' from source: role '' defaults 44109 1727204238.24938: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 44109 1727204238.25050: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 44109 1727204238.25078: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 44109 1727204238.25100: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 44109 1727204238.25125: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 44109 1727204238.25157: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 44109 1727204238.25177: Loading 
TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 44109 1727204238.25196: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 44109 1727204238.25213: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 44109 1727204238.25242: Evaluated conditional (network_connections | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0 or network_state.get("interfaces", []) | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0): False 44109 1727204238.25245: when evaluation is False, skipping this task 44109 1727204238.25248: _execute() done 44109 1727204238.25252: dumping result to json 44109 1727204238.25254: done dumping result, returning 44109 1727204238.25262: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [028d2410-947f-ed67-a560-00000000001b] 44109 1727204238.25265: sending task result for task 028d2410-947f-ed67-a560-00000000001b 44109 1727204238.25341: done sending task result for task 028d2410-947f-ed67-a560-00000000001b 44109 1727204238.25344: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "network_connections | selectattr(\"type\", \"defined\") | selectattr(\"type\", \"match\", \"^team$\") | list | length > 0 or network_state.get(\"interfaces\", []) | selectattr(\"type\", \"defined\") | selectattr(\"type\", \"match\", \"^team$\") | list | length > 0", "skip_reason": "Conditional result was False" } 44109 
1727204238.25410: no more pending results, returning what we have 44109 1727204238.25413: results queue empty 44109 1727204238.25414: checking for any_errors_fatal 44109 1727204238.25421: done checking for any_errors_fatal 44109 1727204238.25421: checking for max_fail_percentage 44109 1727204238.25423: done checking for max_fail_percentage 44109 1727204238.25424: checking to see if all hosts have failed and the running result is not ok 44109 1727204238.25425: done checking to see if all hosts have failed 44109 1727204238.25425: getting the remaining hosts for this loop 44109 1727204238.25427: done getting the remaining hosts for this loop 44109 1727204238.25430: getting the next task for host managed-node1 44109 1727204238.25436: done getting next task for host managed-node1 44109 1727204238.25439: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 44109 1727204238.25442: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44109 1727204238.25456: getting variables 44109 1727204238.25457: in VariableManager get_vars() 44109 1727204238.25494: Calling all_inventory to load vars for managed-node1 44109 1727204238.25497: Calling groups_inventory to load vars for managed-node1 44109 1727204238.25499: Calling all_plugins_inventory to load vars for managed-node1 44109 1727204238.25508: Calling all_plugins_play to load vars for managed-node1 44109 1727204238.25511: Calling groups_plugins_inventory to load vars for managed-node1 44109 1727204238.25513: Calling groups_plugins_play to load vars for managed-node1 44109 1727204238.26307: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44109 1727204238.27172: done with get_vars() 44109 1727204238.27191: done getting variables 44109 1727204238.27261: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36 Tuesday 24 September 2024 14:57:18 -0400 (0:00:00.061) 0:00:15.069 ***** 44109 1727204238.27285: entering _queue_task() for managed-node1/dnf 44109 1727204238.27511: worker is 1 (out of 1 available) 44109 1727204238.27523: exiting _queue_task() for managed-node1/dnf 44109 1727204238.27533: done queuing things up, now waiting for results queue to drain 44109 1727204238.27534: waiting for pending results... 
44109 1727204238.27704: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 44109 1727204238.27789: in run() - task 028d2410-947f-ed67-a560-00000000001c 44109 1727204238.27802: variable 'ansible_search_path' from source: unknown 44109 1727204238.27805: variable 'ansible_search_path' from source: unknown 44109 1727204238.27836: calling self._execute() 44109 1727204238.27905: variable 'ansible_host' from source: host vars for 'managed-node1' 44109 1727204238.27909: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44109 1727204238.27921: variable 'omit' from source: magic vars 44109 1727204238.28179: variable 'ansible_distribution_major_version' from source: facts 44109 1727204238.28188: Evaluated conditional (ansible_distribution_major_version != '6'): True 44109 1727204238.28325: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 44109 1727204238.29831: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 44109 1727204238.29883: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 44109 1727204238.29912: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 44109 1727204238.29941: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 44109 1727204238.29961: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 44109 1727204238.30021: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44109 1727204238.30043: Loading 
FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44109 1727204238.30061: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44109 1727204238.30088: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44109 1727204238.30099: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44109 1727204238.30182: variable 'ansible_distribution' from source: facts 44109 1727204238.30185: variable 'ansible_distribution_major_version' from source: facts 44109 1727204238.30198: Evaluated conditional (ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7): True 44109 1727204238.30279: variable '__network_wireless_connections_defined' from source: role '' defaults 44109 1727204238.30362: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44109 1727204238.30381: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44109 1727204238.30398: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44109 1727204238.30425: Loading 
FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
44109 1727204238.30436: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
44109 1727204238.30462: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
44109 1727204238.30481: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
44109 1727204238.30498: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
44109 1727204238.30525: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
44109 1727204238.30535: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
44109 1727204238.30562: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
44109 1727204238.30581: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
44109 1727204238.30599: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
44109 1727204238.30625: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
44109 1727204238.30635: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
44109 1727204238.30738: variable 'network_connections' from source: task vars
44109 1727204238.30747: variable 'interface' from source: set_fact
44109 1727204238.30796: variable 'interface' from source: set_fact
44109 1727204238.30804: variable 'interface' from source: set_fact
44109 1727204238.30847: variable 'interface' from source: set_fact
44109 1727204238.30911: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name
44109 1727204238.31028: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py
44109 1727204238.31054: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py
44109 1727204238.31077: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py
44109 1727204238.31099: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py
44109 1727204238.31137: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False)
44109 1727204238.31150: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False)
44109 1727204238.31172: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False)
44109 1727204238.31191: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False)
44109 1727204238.31236: variable '__network_team_connections_defined' from source: role '' defaults
44109 1727204238.31405: variable 'network_connections' from source: task vars
44109 1727204238.31408: variable 'interface' from source: set_fact
44109 1727204238.31451: variable 'interface' from source: set_fact
44109 1727204238.31462: variable 'interface' from source: set_fact
44109 1727204238.31501: variable 'interface' from source: set_fact
44109 1727204238.31539: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False
44109 1727204238.31543: when evaluation is False, skipping this task
44109 1727204238.31545: _execute() done
44109 1727204238.31548: dumping result to json
44109 1727204238.31550: done dumping result, returning
44109 1727204238.31557: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [028d2410-947f-ed67-a560-00000000001c]
44109 1727204238.31561: sending task result for task 028d2410-947f-ed67-a560-00000000001c
44109 1727204238.31656: done sending task result for task 028d2410-947f-ed67-a560-00000000001c
44109 1727204238.31658: WORKER PROCESS EXITING
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined",
    "skip_reason": "Conditional result was False"
}
44109 1727204238.31723: no more pending results, returning what we have
44109 1727204238.31727: results queue empty
44109 1727204238.31728: checking for any_errors_fatal
44109 1727204238.31734: done checking for any_errors_fatal
44109 1727204238.31734: checking for max_fail_percentage
44109 1727204238.31736: done checking for max_fail_percentage
44109 1727204238.31737: checking to see if all hosts have failed and the running result is not ok
44109 1727204238.31737: done checking to see if all hosts have failed
44109 1727204238.31738: getting the remaining hosts for this loop
44109 1727204238.31739: done getting the remaining hosts for this loop
44109 1727204238.31743: getting the next task for host managed-node1
44109 1727204238.31748: done getting next task for host managed-node1
44109 1727204238.31752: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces
44109 1727204238.31755: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
44109 1727204238.31768: getting variables
44109 1727204238.31770: in VariableManager get_vars()
44109 1727204238.31807: Calling all_inventory to load vars for managed-node1
44109 1727204238.31810: Calling groups_inventory to load vars for managed-node1
44109 1727204238.31815: Calling all_plugins_inventory to load vars for managed-node1
44109 1727204238.31823: Calling all_plugins_play to load vars for managed-node1
44109 1727204238.31826: Calling groups_plugins_inventory to load vars for managed-node1
44109 1727204238.31828: Calling groups_plugins_play to load vars for managed-node1
44109 1727204238.32802: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
44109 1727204238.33669: done with get_vars()
44109 1727204238.33687: done getting variables
redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf
44109 1727204238.33744: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces] ***
task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48
Tuesday 24 September 2024  14:57:18 -0400 (0:00:00.064)       0:00:15.133 *****
44109 1727204238.33766: entering _queue_task() for managed-node1/yum
44109 1727204238.33767: Creating lock for yum
44109 1727204238.34317: worker is 1 (out of 1 available)
44109 1727204238.34329: exiting _queue_task() for managed-node1/yum
44109 1727204238.34339: done queuing things up, now waiting for results queue to drain
44109 1727204238.34340: waiting for pending results...
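The `redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf` line above shows a collection routing redirect: on Python-3-only platforms such as this Fedora-derived host, tasks written against the `yum` module are served by the `dnf` action plugin. A hypothetical sketch of the kind of task that triggers this (not the role's verbatim source; the task name and conditional are taken from the log, the module arguments are assumptions):

```yaml
# Hedged reconstruction: a yum task like the one queued above. Ansible
# transparently redirects ansible.builtin.yum to ansible.builtin.dnf on
# this host, which is why the log loads action/dnf.py for a "yum" task.
- name: Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces
  ansible.builtin.yum:
    list: updates          # assumed argument, for illustration only
  when: ansible_distribution_major_version | int < 8
```

On this managed node the `when:` expression evaluates to False (the distribution major version is 8 or later), so the task is skipped rather than executed, as the result below records.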
44109 1727204238.34747: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces
44109 1727204238.35195: in run() - task 028d2410-947f-ed67-a560-00000000001d
44109 1727204238.35383: variable 'ansible_search_path' from source: unknown
44109 1727204238.35387: variable 'ansible_search_path' from source: unknown
44109 1727204238.35389: calling self._execute()
44109 1727204238.35520: variable 'ansible_host' from source: host vars for 'managed-node1'
44109 1727204238.35524: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1'
44109 1727204238.35527: variable 'omit' from source: magic vars
44109 1727204238.36049: variable 'ansible_distribution_major_version' from source: facts
44109 1727204238.36071: Evaluated conditional (ansible_distribution_major_version != '6'): True
44109 1727204238.36272: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name
44109 1727204238.38867: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py
44109 1727204238.38959: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py
44109 1727204238.39003: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py
44109 1727204238.39281: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py
44109 1727204238.39285: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py
44109 1727204238.39287: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
44109 1727204238.39290: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
44109 1727204238.39292: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
44109 1727204238.39294: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
44109 1727204238.39296: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
44109 1727204238.39384: variable 'ansible_distribution_major_version' from source: facts
44109 1727204238.39414: Evaluated conditional (ansible_distribution_major_version | int < 8): False
44109 1727204238.39427: when evaluation is False, skipping this task
44109 1727204238.39436: _execute() done
44109 1727204238.39444: dumping result to json
44109 1727204238.39450: done dumping result, returning
44109 1727204238.39460: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces [028d2410-947f-ed67-a560-00000000001d]
44109 1727204238.39469: sending task result for task 028d2410-947f-ed67-a560-00000000001d
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "ansible_distribution_major_version | int < 8",
    "skip_reason": "Conditional result was False"
}
44109 1727204238.39680: no more pending results, returning what we have
44109 1727204238.39684: results queue empty
44109 1727204238.39685: checking for any_errors_fatal
44109 1727204238.39692: done checking for any_errors_fatal
44109 1727204238.39693: checking for max_fail_percentage
44109 1727204238.39695: done checking for max_fail_percentage
44109 1727204238.39696: checking to see if all hosts have failed and the running result is not ok
44109 1727204238.39696: done checking to see if all hosts have failed
44109 1727204238.39697: getting the remaining hosts for this loop
44109 1727204238.39698: done getting the remaining hosts for this loop
44109 1727204238.39702: getting the next task for host managed-node1
44109 1727204238.39709: done getting next task for host managed-node1
44109 1727204238.39715: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces
44109 1727204238.39718: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
44109 1727204238.39735: getting variables
44109 1727204238.39737: in VariableManager get_vars()
44109 1727204238.39779: Calling all_inventory to load vars for managed-node1
44109 1727204238.39782: Calling groups_inventory to load vars for managed-node1
44109 1727204238.39785: Calling all_plugins_inventory to load vars for managed-node1
44109 1727204238.39795: Calling all_plugins_play to load vars for managed-node1
44109 1727204238.39798: Calling groups_plugins_inventory to load vars for managed-node1
44109 1727204238.39801: Calling groups_plugins_play to load vars for managed-node1
44109 1727204238.40434: done sending task result for task 028d2410-947f-ed67-a560-00000000001d
44109 1727204238.40437: WORKER PROCESS EXITING
44109 1727204238.42000: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
44109 1727204238.44151: done with get_vars()
44109 1727204238.44180: done getting variables
44109 1727204238.44239: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces] ***
task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60
Tuesday 24 September 2024  14:57:18 -0400 (0:00:00.105)       0:00:15.239 *****
44109 1727204238.44273: entering _queue_task() for managed-node1/fail
44109 1727204238.44603: worker is 1 (out of 1 available)
44109 1727204238.44616: exiting _queue_task() for managed-node1/fail
44109 1727204238.44628: done queuing things up, now waiting for results queue to drain
44109 1727204238.44629: waiting for pending results...
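The consent task queued above uses the `fail` action (the log loads `action/fail.py`) behind the same `when:` guard seen on the earlier skipped tasks. A hypothetical reconstruction of that pattern (not the role's verbatim source; the task name and conditional come from the log, the message text is an assumption):

```yaml
# Hedged sketch: a fail task guarded by a `when:` conditional. When the
# expression evaluates to False, Ansible never runs the fail action; it
# records the task as skipped with the expression as "false_condition",
# exactly as the skip result below shows.
- name: Ask user's consent to restart NetworkManager due to wireless or team interfaces
  ansible.builtin.fail:
    msg: Restarting NetworkManager is required  # assumed message text
  when: __network_wireless_connections_defined or __network_team_connections_defined
```

Because the inventory defines no wireless or team connections, both role variables are false and the task is skipped, as the next result confirms.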
44109 1727204238.44860: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces
44109 1727204238.45001: in run() - task 028d2410-947f-ed67-a560-00000000001e
44109 1727204238.45028: variable 'ansible_search_path' from source: unknown
44109 1727204238.45038: variable 'ansible_search_path' from source: unknown
44109 1727204238.45181: calling self._execute()
44109 1727204238.45185: variable 'ansible_host' from source: host vars for 'managed-node1'
44109 1727204238.45188: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1'
44109 1727204238.45201: variable 'omit' from source: magic vars
44109 1727204238.45583: variable 'ansible_distribution_major_version' from source: facts
44109 1727204238.45601: Evaluated conditional (ansible_distribution_major_version != '6'): True
44109 1727204238.45725: variable '__network_wireless_connections_defined' from source: role '' defaults
44109 1727204238.45928: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name
44109 1727204238.48068: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py
44109 1727204238.48439: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py
44109 1727204238.48483: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py
44109 1727204238.48522: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py
44109 1727204238.48552: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py
44109 1727204238.48644: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
44109 1727204238.48681: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
44109 1727204238.48715: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
44109 1727204238.48762: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
44109 1727204238.48784: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
44109 1727204238.48835: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
44109 1727204238.48980: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
44109 1727204238.48984: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
44109 1727204238.48987: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
44109 1727204238.48989: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
44109 1727204238.48993: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
44109 1727204238.49020: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
44109 1727204238.49047: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
44109 1727204238.49089: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
44109 1727204238.49108: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
44109 1727204238.49258: variable 'network_connections' from source: task vars
44109 1727204238.49277: variable 'interface' from source: set_fact
44109 1727204238.49352: variable 'interface' from source: set_fact
44109 1727204238.49365: variable 'interface' from source: set_fact
44109 1727204238.49427: variable 'interface' from source: set_fact
44109 1727204238.49521: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name
44109 1727204238.49681: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py
44109 1727204238.49720: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py
44109 1727204238.49754: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py
44109 1727204238.49788: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py
44109 1727204238.49830: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False)
44109 1727204238.49856: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False)
44109 1727204238.49888: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False)
44109 1727204238.49919: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False)
44109 1727204238.49980: variable '__network_team_connections_defined' from source: role '' defaults
44109 1727204238.50279: variable 'network_connections' from source: task vars
44109 1727204238.50282: variable 'interface' from source: set_fact
44109 1727204238.50285: variable 'interface' from source: set_fact
44109 1727204238.50286: variable 'interface' from source: set_fact
44109 1727204238.50339: variable 'interface' from source: set_fact
44109 1727204238.50392: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False
44109 1727204238.50399: when evaluation is False, skipping this task
44109 1727204238.50404: _execute() done
44109 1727204238.50410: dumping result to json
44109 1727204238.50415: done dumping result, returning
44109 1727204238.50424: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces [028d2410-947f-ed67-a560-00000000001e]
44109 1727204238.50438: sending task result for task 028d2410-947f-ed67-a560-00000000001e
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined",
    "skip_reason": "Conditional result was False"
}
44109 1727204238.50601: no more pending results, returning what we have
44109 1727204238.50605: results queue empty
44109 1727204238.50605: checking for any_errors_fatal
44109 1727204238.50613: done checking for any_errors_fatal
44109 1727204238.50614: checking for max_fail_percentage
44109 1727204238.50616: done checking for max_fail_percentage
44109 1727204238.50617: checking to see if all hosts have failed and the running result is not ok
44109 1727204238.50617: done checking to see if all hosts have failed
44109 1727204238.50618: getting the remaining hosts for this loop
44109 1727204238.50619: done getting the remaining hosts for this loop
44109 1727204238.50623: getting the next task for host managed-node1
44109 1727204238.50629: done getting next task for host managed-node1
44109 1727204238.50633: ^ task is: TASK: fedora.linux_system_roles.network : Install packages
44109 1727204238.50636: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
44109 1727204238.50654: getting variables
44109 1727204238.50656: in VariableManager get_vars()
44109 1727204238.50692: Calling all_inventory to load vars for managed-node1
44109 1727204238.50695: Calling groups_inventory to load vars for managed-node1
44109 1727204238.50697: Calling all_plugins_inventory to load vars for managed-node1
44109 1727204238.50708: Calling all_plugins_play to load vars for managed-node1
44109 1727204238.50710: Calling groups_plugins_inventory to load vars for managed-node1
44109 1727204238.50716: Calling groups_plugins_play to load vars for managed-node1
44109 1727204238.51288: done sending task result for task 028d2410-947f-ed67-a560-00000000001e
44109 1727204238.51292: WORKER PROCESS EXITING
44109 1727204238.54968: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
44109 1727204238.56408: done with get_vars()
44109 1727204238.56439: done getting variables
44109 1727204238.56530: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Install packages] ********************
task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73
Tuesday 24 September 2024  14:57:18 -0400 (0:00:00.122)       0:00:15.361 *****
44109 1727204238.56560: entering _queue_task() for managed-node1/package
44109 1727204238.56857: worker is 1 (out of 1 available)
44109 1727204238.56870: exiting _queue_task() for managed-node1/package
44109 1727204238.56885: done queuing things up, now waiting for results queue to drain
44109 1727204238.56887: waiting for pending results...
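The Install packages task queued above uses the generic `package` action (the log loads `action/package.py`), which delegates to the platform's package manager, dnf on this host. A hypothetical sketch of that pattern (not the role's verbatim source; `network_packages` is the role variable the log resolves below, the `state` argument is an assumption):

```yaml
# Hedged sketch: a generic package task like the one entering
# _queue_task() above. ansible.builtin.package picks the concrete
# package-manager module (dnf here) at runtime, and the role computes
# the network_packages list from its defaults, as the variable
# resolution below shows.
- name: Install packages
  ansible.builtin.package:
    name: "{{ network_packages }}"
    state: present          # assumed, for illustration only
```

The long run of `variable ... from source: role '' defaults` entries that follows is the template engine resolving the pieces of that list, for example `__network_packages_default_nm` and `__network_packages_default_wpa_supplicant`, before the action runs.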
44109 1727204238.57078: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Install packages
44109 1727204238.57177: in run() - task 028d2410-947f-ed67-a560-00000000001f
44109 1727204238.57189: variable 'ansible_search_path' from source: unknown
44109 1727204238.57194: variable 'ansible_search_path' from source: unknown
44109 1727204238.57227: calling self._execute()
44109 1727204238.57300: variable 'ansible_host' from source: host vars for 'managed-node1'
44109 1727204238.57304: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1'
44109 1727204238.57313: variable 'omit' from source: magic vars
44109 1727204238.57606: variable 'ansible_distribution_major_version' from source: facts
44109 1727204238.57615: Evaluated conditional (ansible_distribution_major_version != '6'): True
44109 1727204238.57752: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name
44109 1727204238.57952: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py
44109 1727204238.57986: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py
44109 1727204238.58040: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py
44109 1727204238.58066: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py
44109 1727204238.58148: variable 'network_packages' from source: role '' defaults
44109 1727204238.58223: variable '__network_provider_setup' from source: role '' defaults
44109 1727204238.58232: variable '__network_service_name_default_nm' from source: role '' defaults
44109 1727204238.58280: variable '__network_service_name_default_nm' from source: role '' defaults
44109 1727204238.58288: variable '__network_packages_default_nm' from source: role '' defaults
44109 1727204238.58354: variable '__network_packages_default_nm' from source: role '' defaults
44109 1727204238.58508: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name
44109 1727204238.60582: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py
44109 1727204238.60586: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py
44109 1727204238.60588: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py
44109 1727204238.60591: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py
44109 1727204238.60593: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py
44109 1727204238.60661: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
44109 1727204238.60698: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
44109 1727204238.60732: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
44109 1727204238.60780: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
44109 1727204238.60800: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
44109 1727204238.60851: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
44109 1727204238.60880: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
44109 1727204238.60908: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
44109 1727204238.60955: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
44109 1727204238.60974: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
44109 1727204238.61228: variable '__network_packages_default_gobject_packages' from source: role '' defaults
44109 1727204238.61329: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
44109 1727204238.61345: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
44109 1727204238.61362: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
44109 1727204238.61389: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
44109 1727204238.61399: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
44109 1727204238.61473: variable 'ansible_python' from source: facts
44109 1727204238.61490: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults
44109 1727204238.61552: variable '__network_wpa_supplicant_required' from source: role '' defaults
44109 1727204238.61608: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults
44109 1727204238.61694: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
44109 1727204238.61710: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
44109 1727204238.61729: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
44109 1727204238.61757: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
44109 1727204238.61768: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
44109 1727204238.61801: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
44109 1727204238.61823: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
44109 1727204238.61840: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
44109 1727204238.61867: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
44109 1727204238.61879: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
44109 1727204238.61977: variable 'network_connections' from source: task vars
44109 1727204238.61982: variable 'interface' from source: set_fact
44109 1727204238.62054: variable 'interface' from source: set_fact
44109 1727204238.62062: variable 'interface' from source: set_fact
44109 1727204238.62133: variable 'interface' from source: set_fact
44109 1727204238.62200: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False)
44109 1727204238.62222: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False)
44109 1727204238.62242: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False)
44109 1727204238.62262: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False)
44109 1727204238.62301: variable '__network_wireless_connections_defined' from source: role '' defaults
44109 1727204238.62479: variable 'network_connections' from source: task vars
44109 1727204238.62483: variable 'interface' from source: set_fact
44109 1727204238.62555: variable 'interface' from source: set_fact
44109 1727204238.62562: variable 'interface' from source: set_fact
44109 1727204238.62633: variable 'interface' from source: set_fact
44109 1727204238.62684: variable '__network_packages_default_wireless' from source: role '' defaults
44109 1727204238.62741: variable '__network_wireless_connections_defined' from source: role '' defaults
44109 1727204238.62933: variable 'network_connections' from source: task vars
44109 1727204238.62939: variable 'interface' from source: set_fact
44109 1727204238.62984: variable 'interface' from source: set_fact
44109 1727204238.62989: variable 'interface' from source: set_fact
44109 1727204238.63035: variable 'interface' from source: set_fact
44109 1727204238.63067: variable '__network_packages_default_team' from source: role '' defaults
44109 1727204238.63126: variable '__network_team_connections_defined' from source: role '' defaults
44109 1727204238.63570: variable 'network_connections' from source: task vars
44109 1727204238.63573: variable 'interface' from source: set_fact
44109 1727204238.63576: variable 'interface' from source: set_fact
44109 1727204238.63578: variable 'interface' from source: set_fact
44109 1727204238.63618: variable 'interface' from source: set_fact
44109 1727204238.63708: variable '__network_service_name_default_initscripts' from source: role '' defaults
44109 
1727204238.63767: variable '__network_service_name_default_initscripts' from source: role '' defaults 44109 1727204238.63782: variable '__network_packages_default_initscripts' from source: role '' defaults 44109 1727204238.63849: variable '__network_packages_default_initscripts' from source: role '' defaults 44109 1727204238.64083: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 44109 1727204238.64415: variable 'network_connections' from source: task vars 44109 1727204238.64419: variable 'interface' from source: set_fact 44109 1727204238.64482: variable 'interface' from source: set_fact 44109 1727204238.64485: variable 'interface' from source: set_fact 44109 1727204238.64508: variable 'interface' from source: set_fact 44109 1727204238.64529: variable 'ansible_distribution' from source: facts 44109 1727204238.64532: variable '__network_rh_distros' from source: role '' defaults 44109 1727204238.64537: variable 'ansible_distribution_major_version' from source: facts 44109 1727204238.64555: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 44109 1727204238.64660: variable 'ansible_distribution' from source: facts 44109 1727204238.64663: variable '__network_rh_distros' from source: role '' defaults 44109 1727204238.64669: variable 'ansible_distribution_major_version' from source: facts 44109 1727204238.64683: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 44109 1727204238.64786: variable 'ansible_distribution' from source: facts 44109 1727204238.64789: variable '__network_rh_distros' from source: role '' defaults 44109 1727204238.64795: variable 'ansible_distribution_major_version' from source: facts 44109 1727204238.64821: variable 'network_provider' from source: set_fact 44109 1727204238.64832: variable 'ansible_facts' from source: unknown 44109 1727204238.65253: Evaluated conditional (not network_packages is 
subset(ansible_facts.packages.keys())): False 44109 1727204238.65256: when evaluation is False, skipping this task 44109 1727204238.65259: _execute() done 44109 1727204238.65261: dumping result to json 44109 1727204238.65263: done dumping result, returning 44109 1727204238.65271: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Install packages [028d2410-947f-ed67-a560-00000000001f] 44109 1727204238.65276: sending task result for task 028d2410-947f-ed67-a560-00000000001f 44109 1727204238.65364: done sending task result for task 028d2410-947f-ed67-a560-00000000001f 44109 1727204238.65366: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "not network_packages is subset(ansible_facts.packages.keys())", "skip_reason": "Conditional result was False" } 44109 1727204238.65420: no more pending results, returning what we have 44109 1727204238.65424: results queue empty 44109 1727204238.65424: checking for any_errors_fatal 44109 1727204238.65431: done checking for any_errors_fatal 44109 1727204238.65432: checking for max_fail_percentage 44109 1727204238.65433: done checking for max_fail_percentage 44109 1727204238.65434: checking to see if all hosts have failed and the running result is not ok 44109 1727204238.65435: done checking to see if all hosts have failed 44109 1727204238.65435: getting the remaining hosts for this loop 44109 1727204238.65438: done getting the remaining hosts for this loop 44109 1727204238.65441: getting the next task for host managed-node1 44109 1727204238.65447: done getting next task for host managed-node1 44109 1727204238.65451: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 44109 1727204238.65453: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44109 1727204238.65473: getting variables 44109 1727204238.65477: in VariableManager get_vars() 44109 1727204238.65516: Calling all_inventory to load vars for managed-node1 44109 1727204238.65519: Calling groups_inventory to load vars for managed-node1 44109 1727204238.65521: Calling all_plugins_inventory to load vars for managed-node1 44109 1727204238.65532: Calling all_plugins_play to load vars for managed-node1 44109 1727204238.65534: Calling groups_plugins_inventory to load vars for managed-node1 44109 1727204238.65536: Calling groups_plugins_play to load vars for managed-node1 44109 1727204238.66331: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44109 1727204238.67200: done with get_vars() 44109 1727204238.67220: done getting variables 44109 1727204238.67262: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85 Tuesday 24 September 2024 14:57:18 -0400 (0:00:00.107) 0:00:15.469 ***** 44109 1727204238.67287: entering _queue_task() for managed-node1/package 44109 1727204238.67530: worker is 1 (out of 1 available) 44109 1727204238.67542: exiting 
_queue_task() for managed-node1/package 44109 1727204238.67553: done queuing things up, now waiting for results queue to drain 44109 1727204238.67554: waiting for pending results... 44109 1727204238.67736: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 44109 1727204238.67830: in run() - task 028d2410-947f-ed67-a560-000000000020 44109 1727204238.67844: variable 'ansible_search_path' from source: unknown 44109 1727204238.67847: variable 'ansible_search_path' from source: unknown 44109 1727204238.67877: calling self._execute() 44109 1727204238.67947: variable 'ansible_host' from source: host vars for 'managed-node1' 44109 1727204238.67951: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44109 1727204238.67960: variable 'omit' from source: magic vars 44109 1727204238.68250: variable 'ansible_distribution_major_version' from source: facts 44109 1727204238.68259: Evaluated conditional (ansible_distribution_major_version != '6'): True 44109 1727204238.68344: variable 'network_state' from source: role '' defaults 44109 1727204238.68353: Evaluated conditional (network_state != {}): False 44109 1727204238.68356: when evaluation is False, skipping this task 44109 1727204238.68359: _execute() done 44109 1727204238.68362: dumping result to json 44109 1727204238.68364: done dumping result, returning 44109 1727204238.68371: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [028d2410-947f-ed67-a560-000000000020] 44109 1727204238.68377: sending task result for task 028d2410-947f-ed67-a560-000000000020 44109 1727204238.68470: done sending task result for task 028d2410-947f-ed67-a560-000000000020 44109 1727204238.68472: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": 
"Conditional result was False" } 44109 1727204238.68521: no more pending results, returning what we have 44109 1727204238.68524: results queue empty 44109 1727204238.68525: checking for any_errors_fatal 44109 1727204238.68533: done checking for any_errors_fatal 44109 1727204238.68534: checking for max_fail_percentage 44109 1727204238.68535: done checking for max_fail_percentage 44109 1727204238.68536: checking to see if all hosts have failed and the running result is not ok 44109 1727204238.68537: done checking to see if all hosts have failed 44109 1727204238.68538: getting the remaining hosts for this loop 44109 1727204238.68539: done getting the remaining hosts for this loop 44109 1727204238.68542: getting the next task for host managed-node1 44109 1727204238.68550: done getting next task for host managed-node1 44109 1727204238.68553: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 44109 1727204238.68556: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44109 1727204238.68571: getting variables 44109 1727204238.68572: in VariableManager get_vars() 44109 1727204238.68609: Calling all_inventory to load vars for managed-node1 44109 1727204238.68612: Calling groups_inventory to load vars for managed-node1 44109 1727204238.68614: Calling all_plugins_inventory to load vars for managed-node1 44109 1727204238.68622: Calling all_plugins_play to load vars for managed-node1 44109 1727204238.68625: Calling groups_plugins_inventory to load vars for managed-node1 44109 1727204238.68627: Calling groups_plugins_play to load vars for managed-node1 44109 1727204238.69490: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44109 1727204238.70354: done with get_vars() 44109 1727204238.70371: done getting variables 44109 1727204238.70418: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96 Tuesday 24 September 2024 14:57:18 -0400 (0:00:00.031) 0:00:15.500 ***** 44109 1727204238.70441: entering _queue_task() for managed-node1/package 44109 1727204238.70682: worker is 1 (out of 1 available) 44109 1727204238.70696: exiting _queue_task() for managed-node1/package 44109 1727204238.70709: done queuing things up, now waiting for results queue to drain 44109 1727204238.70710: waiting for pending results... 
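The install task above was skipped because its guard `network_state != {}` evaluated to False, i.e. no declarative `network_state` was supplied in this run. As a rough sketch, a task guarded this way looks like the following (hypothetical reconstruction from the logged task name and conditionals at tasks/main.yml:85; the role's actual module arguments and package list may differ):

```yaml
# Hypothetical reconstruction based on the conditionals logged above,
# NOT the verbatim role source.
- name: Install NetworkManager and nmstate when using network_state variable
  ansible.builtin.package:
    name:
      - NetworkManager   # assumed package list, for illustration only
      - nmstate
    state: present
  when:
    - ansible_distribution_major_version != '6'   # evaluated True in the log
    - network_state != {}                         # evaluated False, so the task is skipped
```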
44109 1727204238.70889: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable
44109 1727204238.70973: in run() - task 028d2410-947f-ed67-a560-000000000021
44109 1727204238.70988: variable 'ansible_search_path' from source: unknown
44109 1727204238.70991: variable 'ansible_search_path' from source: unknown
44109 1727204238.71022: calling self._execute()
44109 1727204238.71093: variable 'ansible_host' from source: host vars for 'managed-node1'
44109 1727204238.71096: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1'
44109 1727204238.71106: variable 'omit' from source: magic vars
44109 1727204238.71390: variable 'ansible_distribution_major_version' from source: facts
44109 1727204238.71399: Evaluated conditional (ansible_distribution_major_version != '6'): True
44109 1727204238.71482: variable 'network_state' from source: role '' defaults
44109 1727204238.71491: Evaluated conditional (network_state != {}): False
44109 1727204238.71496: when evaluation is False, skipping this task
44109 1727204238.71498: _execute() done
44109 1727204238.71501: dumping result to json
44109 1727204238.71504: done dumping result, returning
44109 1727204238.71511: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [028d2410-947f-ed67-a560-000000000021]
44109 1727204238.71517: sending task result for task 028d2410-947f-ed67-a560-000000000021
44109 1727204238.71606: done sending task result for task 028d2410-947f-ed67-a560-000000000021
44109 1727204238.71608: WORKER PROCESS EXITING
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "network_state != {}",
    "skip_reason": "Conditional result was False"
}
44109 1727204238.71653: no more pending results, returning what we have
44109 1727204238.71657: results queue empty
44109 1727204238.71657: checking for any_errors_fatal
44109 1727204238.71664: done checking for any_errors_fatal
44109 1727204238.71665: checking for max_fail_percentage
44109 1727204238.71667: done checking for max_fail_percentage
44109 1727204238.71668: checking to see if all hosts have failed and the running result is not ok
44109 1727204238.71668: done checking to see if all hosts have failed
44109 1727204238.71669: getting the remaining hosts for this loop
44109 1727204238.71670: done getting the remaining hosts for this loop
44109 1727204238.71673: getting the next task for host managed-node1
44109 1727204238.71682: done getting next task for host managed-node1
44109 1727204238.71685: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces
44109 1727204238.71688: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
44109 1727204238.71703: getting variables
44109 1727204238.71704: in VariableManager get_vars()
44109 1727204238.71737: Calling all_inventory to load vars for managed-node1
44109 1727204238.71740: Calling groups_inventory to load vars for managed-node1
44109 1727204238.71742: Calling all_plugins_inventory to load vars for managed-node1
44109 1727204238.71750: Calling all_plugins_play to load vars for managed-node1
44109 1727204238.71752: Calling groups_plugins_inventory to load vars for managed-node1
44109 1727204238.71754: Calling groups_plugins_play to load vars for managed-node1
44109 1727204238.72510: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
44109 1727204238.73482: done with get_vars()
44109 1727204238.73497: done getting variables
44109 1727204238.73572: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True)

TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] ***
task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109
Tuesday 24 September 2024 14:57:18 -0400 (0:00:00.031) 0:00:15.532 *****
44109 1727204238.73596: entering _queue_task() for managed-node1/service
44109 1727204238.73597: Creating lock for service
44109 1727204238.73834: worker is 1 (out of 1 available)
44109 1727204238.73847: exiting _queue_task() for managed-node1/service
44109 1727204238.73858: done queuing things up, now waiting for results queue to drain
44109 1727204238.73859: waiting for pending results...
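Earlier in this stretch, the role's "Install packages" task was skipped with the false condition `not network_packages is subset(ansible_facts.packages.keys())`: the package module only runs when at least one entry of `network_packages` is missing from the package inventory gathered into `ansible_facts.packages`. A minimal sketch of that guard pattern (hypothetical tasks illustrating the logged conditional, not the role's verbatim source):

```yaml
# Hypothetical sketch of the subset-test guard seen in the log above.
- name: Gather installed packages into ansible_facts.packages
  ansible.builtin.package_facts:

- name: Install packages   # runs only if something in network_packages is missing
  ansible.builtin.package:
    name: "{{ network_packages }}"
    state: present
  when: not network_packages is subset(ansible_facts.packages.keys())
```

Because every package was already present on managed-node1, the subset test held and the task was skipped without contacting the package manager.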
44109 1727204238.74039: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces
44109 1727204238.74132: in run() - task 028d2410-947f-ed67-a560-000000000022
44109 1727204238.74144: variable 'ansible_search_path' from source: unknown
44109 1727204238.74147: variable 'ansible_search_path' from source: unknown
44109 1727204238.74176: calling self._execute()
44109 1727204238.74248: variable 'ansible_host' from source: host vars for 'managed-node1'
44109 1727204238.74252: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1'
44109 1727204238.74260: variable 'omit' from source: magic vars
44109 1727204238.74544: variable 'ansible_distribution_major_version' from source: facts
44109 1727204238.74553: Evaluated conditional (ansible_distribution_major_version != '6'): True
44109 1727204238.74641: variable '__network_wireless_connections_defined' from source: role '' defaults
44109 1727204238.74769: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name
44109 1727204238.76260: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py
44109 1727204238.76310: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py
44109 1727204238.76341: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py
44109 1727204238.76366: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py
44109 1727204238.76392: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py
44109 1727204238.76448: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
44109 1727204238.76468: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
44109 1727204238.76495: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
44109 1727204238.76523: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
44109 1727204238.76534: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
44109 1727204238.76566: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
44109 1727204238.76681: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
44109 1727204238.76687: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
44109 1727204238.76689: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
44109 1727204238.76691: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
44109 1727204238.76693: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
44109 1727204238.76696: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
44109 1727204238.76698: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
44109 1727204238.76714: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
44109 1727204238.76723: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
44109 1727204238.76835: variable 'network_connections' from source: task vars
44109 1727204238.76845: variable 'interface' from source: set_fact
44109 1727204238.76896: variable 'interface' from source: set_fact
44109 1727204238.76904: variable 'interface' from source: set_fact
44109 1727204238.76949: variable 'interface' from source: set_fact
44109 1727204238.77014: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name
44109 1727204238.77131: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py
44109 1727204238.77159: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py
44109 1727204238.77183: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py
44109 1727204238.77203: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py
44109 1727204238.77234: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False)
44109 1727204238.77250: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False)
44109 1727204238.77269: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False)
44109 1727204238.77288: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False)
44109 1727204238.77333: variable '__network_team_connections_defined' from source: role '' defaults
44109 1727204238.77478: variable 'network_connections' from source: task vars
44109 1727204238.77482: variable 'interface' from source: set_fact
44109 1727204238.77526: variable 'interface' from source: set_fact
44109 1727204238.77531: variable 'interface' from source: set_fact
44109 1727204238.77579: variable 'interface' from source: set_fact
44109 1727204238.77614: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False
44109 1727204238.77618: when evaluation is False, skipping this task
44109 1727204238.77620: _execute() done
44109 1727204238.77623: dumping result to json
44109 1727204238.77625: done dumping result, returning
44109 1727204238.77631: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces [028d2410-947f-ed67-a560-000000000022]
44109 1727204238.77641: sending task result for task 028d2410-947f-ed67-a560-000000000022
44109 1727204238.77720: done sending task result for task 028d2410-947f-ed67-a560-000000000022
44109 1727204238.77722: WORKER PROCESS EXITING
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined",
    "skip_reason": "Conditional result was False"
}
44109 1727204238.77767: no more pending results, returning what we have
44109 1727204238.77770: results queue empty
44109 1727204238.77771: checking for any_errors_fatal
44109 1727204238.77782: done checking for any_errors_fatal
44109 1727204238.77783: checking for max_fail_percentage
44109 1727204238.77785: done checking for max_fail_percentage
44109 1727204238.77785: checking to see if all hosts have failed and the running result is not ok
44109 1727204238.77786: done checking to see if all hosts have failed
44109 1727204238.77787: getting the remaining hosts for this loop
44109 1727204238.77788: done getting the remaining hosts for this loop
44109 1727204238.77791: getting the next task for host managed-node1
44109 1727204238.77798: done getting next task for host managed-node1
44109 1727204238.77801: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager
44109 1727204238.77804: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
44109 1727204238.77821: getting variables
44109 1727204238.77823: in VariableManager get_vars()
44109 1727204238.77860: Calling all_inventory to load vars for managed-node1
44109 1727204238.77863: Calling groups_inventory to load vars for managed-node1
44109 1727204238.77865: Calling all_plugins_inventory to load vars for managed-node1
44109 1727204238.77874: Calling all_plugins_play to load vars for managed-node1
44109 1727204238.77878: Calling groups_plugins_inventory to load vars for managed-node1
44109 1727204238.77881: Calling groups_plugins_play to load vars for managed-node1
44109 1727204238.78688: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
44109 1727204238.79541: done with get_vars()
44109 1727204238.79559: done getting variables
44109 1727204238.79604: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] *****
task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122
Tuesday 24 September 2024 14:57:18 -0400 (0:00:00.060) 0:00:15.592 *****
44109 1727204238.79629: entering _queue_task() for managed-node1/service
44109 1727204238.79869: worker is 1 (out of 1 available)
44109 1727204238.79884: exiting _queue_task() for managed-node1/service
44109 1727204238.79897: done queuing things up, now waiting for results queue to drain
44109 1727204238.79897: waiting for pending results...
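The restart task above is skipped because neither wireless nor team connections are defined in `network_connections`, while the "Enable and start NetworkManager" task that follows is guarded by `network_provider == "nm" or network_state != {}`, which the log shows evaluating True. A rough sketch of this pair of service tasks (hypothetical, inferred from the logged task names and conditionals at tasks/main.yml:109 and :122; the role's real source may differ):

```yaml
# Hypothetical reconstruction from the logged conditionals; NOT verbatim role source.
- name: Restart NetworkManager due to wireless or team interfaces
  ansible.builtin.service:
    name: NetworkManager
    state: restarted
  # Both flags evaluated False in this run, so this task was skipped.
  when: __network_wireless_connections_defined or __network_team_connections_defined

- name: Enable and start NetworkManager
  ansible.builtin.service:
    name: "{{ network_service_name }}"   # resolved from role defaults in the log
    state: started
    enabled: true
  # Evaluated True here: network_provider was set to "nm" via set_fact.
  when: network_provider == "nm" or network_state != {}
```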
44109 1727204238.80071: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 44109 1727204238.80158: in run() - task 028d2410-947f-ed67-a560-000000000023 44109 1727204238.80171: variable 'ansible_search_path' from source: unknown 44109 1727204238.80175: variable 'ansible_search_path' from source: unknown 44109 1727204238.80205: calling self._execute() 44109 1727204238.80278: variable 'ansible_host' from source: host vars for 'managed-node1' 44109 1727204238.80282: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44109 1727204238.80292: variable 'omit' from source: magic vars 44109 1727204238.80569: variable 'ansible_distribution_major_version' from source: facts 44109 1727204238.80579: Evaluated conditional (ansible_distribution_major_version != '6'): True 44109 1727204238.80695: variable 'network_provider' from source: set_fact 44109 1727204238.80699: variable 'network_state' from source: role '' defaults 44109 1727204238.80709: Evaluated conditional (network_provider == "nm" or network_state != {}): True 44109 1727204238.80718: variable 'omit' from source: magic vars 44109 1727204238.80756: variable 'omit' from source: magic vars 44109 1727204238.80779: variable 'network_service_name' from source: role '' defaults 44109 1727204238.80837: variable 'network_service_name' from source: role '' defaults 44109 1727204238.80909: variable '__network_provider_setup' from source: role '' defaults 44109 1727204238.80916: variable '__network_service_name_default_nm' from source: role '' defaults 44109 1727204238.80960: variable '__network_service_name_default_nm' from source: role '' defaults 44109 1727204238.80967: variable '__network_packages_default_nm' from source: role '' defaults 44109 1727204238.81013: variable '__network_packages_default_nm' from source: role '' defaults 44109 1727204238.81160: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due 
to reserved name 44109 1727204238.83388: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 44109 1727204238.83484: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 44109 1727204238.83488: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 44109 1727204238.83514: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 44109 1727204238.83548: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 44109 1727204238.83645: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44109 1727204238.83685: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44109 1727204238.83724: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44109 1727204238.83777: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44109 1727204238.83800: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44109 1727204238.83863: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 
(found_in_cache=True, class_only=False) 44109 1727204238.83946: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44109 1727204238.83950: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44109 1727204238.83981: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44109 1727204238.84003: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44109 1727204238.84253: variable '__network_packages_default_gobject_packages' from source: role '' defaults 44109 1727204238.84384: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44109 1727204238.84415: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44109 1727204238.84444: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44109 1727204238.84681: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44109 1727204238.84685: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44109 1727204238.84687: variable 'ansible_python' from source: facts 44109 1727204238.84690: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 44109 1727204238.84728: variable '__network_wpa_supplicant_required' from source: role '' defaults 44109 1727204238.84820: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 44109 1727204238.84963: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44109 1727204238.84999: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44109 1727204238.85039: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44109 1727204238.85087: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44109 1727204238.85110: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44109 1727204238.85172: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44109 1727204238.85215: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44109 1727204238.85255: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44109 1727204238.85303: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44109 1727204238.85325: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44109 1727204238.85487: variable 'network_connections' from source: task vars 44109 1727204238.85502: variable 'interface' from source: set_fact 44109 1727204238.85685: variable 'interface' from source: set_fact 44109 1727204238.85688: variable 'interface' from source: set_fact 44109 1727204238.85690: variable 'interface' from source: set_fact 44109 1727204238.85931: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 44109 1727204238.86605: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 44109 1727204238.86668: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 44109 1727204238.86721: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 44109 1727204238.86773: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 44109 1727204238.86886: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 44109 1727204238.86892: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 44109 1727204238.86928: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 44109 1727204238.86967: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 44109 1727204238.87027: variable '__network_wireless_connections_defined' from source: role '' defaults 44109 1727204238.87368: variable 'network_connections' from source: task vars 44109 1727204238.87384: variable 'interface' from source: set_fact 44109 1727204238.87481: variable 'interface' from source: set_fact 44109 1727204238.87539: variable 'interface' from source: set_fact 44109 1727204238.87578: variable 'interface' from source: set_fact 44109 1727204238.87866: variable '__network_packages_default_wireless' from source: role '' defaults 44109 1727204238.87870: variable '__network_wireless_connections_defined' from source: role '' defaults 44109 1727204238.88170: variable 'network_connections' from source: task vars 44109 1727204238.88186: variable 'interface' from source: set_fact 44109 1727204238.88281: variable 'interface' from source: set_fact 44109 1727204238.88284: variable 'interface' from source: set_fact 44109 1727204238.88354: variable 'interface' from source: set_fact 44109 1727204238.88414: variable '__network_packages_default_team' from source: role '' defaults 44109 1727204238.88522: variable '__network_team_connections_defined' from source: role '' defaults 44109 1727204238.88810: variable 
'network_connections' from source: task vars 44109 1727204238.88821: variable 'interface' from source: set_fact 44109 1727204238.88901: variable 'interface' from source: set_fact 44109 1727204238.88958: variable 'interface' from source: set_fact 44109 1727204238.88993: variable 'interface' from source: set_fact 44109 1727204238.89089: variable '__network_service_name_default_initscripts' from source: role '' defaults 44109 1727204238.89152: variable '__network_service_name_default_initscripts' from source: role '' defaults 44109 1727204238.89164: variable '__network_packages_default_initscripts' from source: role '' defaults 44109 1727204238.89232: variable '__network_packages_default_initscripts' from source: role '' defaults 44109 1727204238.89455: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 44109 1727204238.89985: variable 'network_connections' from source: task vars 44109 1727204238.90047: variable 'interface' from source: set_fact 44109 1727204238.90063: variable 'interface' from source: set_fact 44109 1727204238.90074: variable 'interface' from source: set_fact 44109 1727204238.90136: variable 'interface' from source: set_fact 44109 1727204238.90182: variable 'ansible_distribution' from source: facts 44109 1727204238.90192: variable '__network_rh_distros' from source: role '' defaults 44109 1727204238.90203: variable 'ansible_distribution_major_version' from source: facts 44109 1727204238.90231: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 44109 1727204238.90420: variable 'ansible_distribution' from source: facts 44109 1727204238.90482: variable '__network_rh_distros' from source: role '' defaults 44109 1727204238.90485: variable 'ansible_distribution_major_version' from source: facts 44109 1727204238.90488: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 44109 1727204238.90642: variable 'ansible_distribution' from source: 
facts 44109 1727204238.90652: variable '__network_rh_distros' from source: role '' defaults 44109 1727204238.90662: variable 'ansible_distribution_major_version' from source: facts 44109 1727204238.90714: variable 'network_provider' from source: set_fact 44109 1727204238.90742: variable 'omit' from source: magic vars 44109 1727204238.90808: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 44109 1727204238.90811: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 44109 1727204238.90836: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 44109 1727204238.90856: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44109 1727204238.90870: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44109 1727204238.90907: variable 'inventory_hostname' from source: host vars for 'managed-node1' 44109 1727204238.90928: variable 'ansible_host' from source: host vars for 'managed-node1' 44109 1727204238.90981: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44109 1727204238.91051: Set connection var ansible_connection to ssh 44109 1727204238.91062: Set connection var ansible_timeout to 10 44109 1727204238.91073: Set connection var ansible_module_compression to ZIP_DEFLATED 44109 1727204238.91088: Set connection var ansible_pipelining to False 44109 1727204238.91099: Set connection var ansible_shell_executable to /bin/sh 44109 1727204238.91109: Set connection var ansible_shell_type to sh 44109 1727204238.91142: variable 'ansible_shell_executable' from source: unknown 44109 1727204238.91154: variable 'ansible_connection' from source: unknown 44109 1727204238.91162: variable 'ansible_module_compression' from source: unknown 44109 1727204238.91241: 
variable 'ansible_shell_type' from source: unknown 44109 1727204238.91244: variable 'ansible_shell_executable' from source: unknown 44109 1727204238.91247: variable 'ansible_host' from source: host vars for 'managed-node1' 44109 1727204238.91255: variable 'ansible_pipelining' from source: unknown 44109 1727204238.91258: variable 'ansible_timeout' from source: unknown 44109 1727204238.91260: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44109 1727204238.91322: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 44109 1727204238.91338: variable 'omit' from source: magic vars 44109 1727204238.91348: starting attempt loop 44109 1727204238.91355: running the handler 44109 1727204238.91485: variable 'ansible_facts' from source: unknown 44109 1727204238.92211: _low_level_execute_command(): starting 44109 1727204238.92224: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 44109 1727204238.92947: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44109 1727204238.92965: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44109 1727204238.93019: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44109 1727204238.93127: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config 
debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK <<< 44109 1727204238.93180: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44109 1727204238.93271: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44109 1727204238.95082: stdout chunk (state=3): >>>/root <<< 44109 1727204238.95344: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44109 1727204238.95347: stdout chunk (state=3): >>><<< 44109 1727204238.95349: stderr chunk (state=3): >>><<< 44109 1727204238.95352: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting 
O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44109 1727204238.95354: _low_level_execute_command(): starting 44109 1727204238.95357: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204238.952494-45727-54789428942887 `" && echo ansible-tmp-1727204238.952494-45727-54789428942887="` echo /root/.ansible/tmp/ansible-tmp-1727204238.952494-45727-54789428942887 `" ) && sleep 0' 44109 1727204238.95888: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44109 1727204238.95909: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44109 1727204238.96016: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44109 1727204238.96036: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 44109 1727204238.96057: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44109 1727204238.96073: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 
44109 1727204238.96190: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44109 1727204238.98339: stdout chunk (state=3): >>>ansible-tmp-1727204238.952494-45727-54789428942887=/root/.ansible/tmp/ansible-tmp-1727204238.952494-45727-54789428942887 <<< 44109 1727204238.98507: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44109 1727204238.98510: stdout chunk (state=3): >>><<< 44109 1727204238.98516: stderr chunk (state=3): >>><<< 44109 1727204238.98542: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204238.952494-45727-54789428942887=/root/.ansible/tmp/ansible-tmp-1727204238.952494-45727-54789428942887 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44109 1727204238.98585: variable 'ansible_module_compression' from source: unknown 44109 1727204238.98648: ANSIBALLZ: Using generic 
lock for ansible.legacy.systemd 44109 1727204238.98785: ANSIBALLZ: Acquiring lock 44109 1727204238.98788: ANSIBALLZ: Lock acquired: 139907468546112 44109 1727204238.98791: ANSIBALLZ: Creating module 44109 1727204239.33116: ANSIBALLZ: Writing module into payload 44109 1727204239.33683: ANSIBALLZ: Writing module 44109 1727204239.33690: ANSIBALLZ: Renaming module 44109 1727204239.33692: ANSIBALLZ: Done creating module 44109 1727204239.33694: variable 'ansible_facts' from source: unknown 44109 1727204239.34081: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204238.952494-45727-54789428942887/AnsiballZ_systemd.py 44109 1727204239.34520: Sending initial data 44109 1727204239.34525: Sent initial data (154 bytes) 44109 1727204239.35626: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44109 1727204239.35897: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 44109 1727204239.36026: stderr chunk (state=3): >>>debug1: mux_client_request_session: 
master session id: 2 <<< 44109 1727204239.38006: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 debug2: Sending SSH2_FXP_REALPATH "." debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204238.952494-45727-54789428942887/AnsiballZ_systemd.py" <<< 44109 1727204239.38010: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-44109pzfqangk/tmp70pwz1j1 /root/.ansible/tmp/ansible-tmp-1727204238.952494-45727-54789428942887/AnsiballZ_systemd.py <<< 44109 1727204239.38235: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-44109pzfqangk/tmp70pwz1j1" to remote "/root/.ansible/tmp/ansible-tmp-1727204238.952494-45727-54789428942887/AnsiballZ_systemd.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204238.952494-45727-54789428942887/AnsiballZ_systemd.py" <<< 44109 1727204239.41471: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44109 1727204239.41478: stdout chunk (state=3): >>><<< 44109 1727204239.41485: stderr chunk (state=3): >>><<< 44109 1727204239.41528: done transferring module to remote 44109 1727204239.41554: _low_level_execute_command(): starting 44109 1727204239.41557: 
_low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204238.952494-45727-54789428942887/ /root/.ansible/tmp/ansible-tmp-1727204238.952494-45727-54789428942887/AnsiballZ_systemd.py && sleep 0' 44109 1727204239.42784: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44109 1727204239.42794: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44109 1727204239.42810: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44109 1727204239.42824: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44109 1727204239.42839: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 <<< 44109 1727204239.42842: stderr chunk (state=3): >>>debug2: match not found <<< 44109 1727204239.42853: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44109 1727204239.42867: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 44109 1727204239.42878: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.14.47 is address <<< 44109 1727204239.43090: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK <<< 44109 1727204239.43098: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44109 1727204239.43210: stderr chunk (state=3): >>>debug1: mux_client_request_session: 
master session id: 2 <<< 44109 1727204239.45292: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44109 1727204239.45296: stdout chunk (state=3): >>><<< 44109 1727204239.45302: stderr chunk (state=3): >>><<< 44109 1727204239.45393: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44109 1727204239.45398: _low_level_execute_command(): starting 44109 1727204239.45401: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204238.952494-45727-54789428942887/AnsiballZ_systemd.py && sleep 0' 44109 1727204239.46792: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44109 1727204239.46795: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44109 1727204239.46798: stderr chunk (state=3): 
>>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44109 1727204239.46800: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44109 1727204239.46802: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 <<< 44109 1727204239.46803: stderr chunk (state=3): >>>debug2: match not found <<< 44109 1727204239.46809: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44109 1727204239.46891: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 44109 1727204239.46896: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44109 1727204239.46931: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44109 1727204239.47314: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44109 1727204239.78423: stdout chunk (state=3): >>> {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": 
"infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "705", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Tue 2024-09-24 14:44:02 EDT", "ExecMainStartTimestampMonotonic": "33322039", "ExecMainExitTimestampMonotonic": "0", "ExecMainHandoffTimestamp": "Tue 2024-09-24 14:44:02 EDT", "ExecMainHandoffTimestampMonotonic": "33336258", "ExecMainPID": "705", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "2938", "MemoryCurrent": "10727424", "MemoryPeak": "13869056", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", 
"MemoryZSwapCurrent": "0", "MemoryAvailable": "3298058240", "EffectiveMemoryMax": "3702874112", "EffectiveMemoryHigh": "3702874112", "CPUUsageNSec": "1755864000", "TasksCurrent": "4", "EffectiveTasksMax": "22362", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryZSwapWriteback": "yes", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22362", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpR<<< 44109 1727204239.78451: stdout chunk (state=3): >>>eceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": 
"infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13976", "LimitNPROCSoft": "13976", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13976", "LimitSIGPENDINGSoft": "13976", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", 
"RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "sysinit.target system.slice dbus.socket", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "NetworkManager-wait-online.service cloud-init.service network.target shutdown.target multi-user.target", "After": "network-pre.target sysinit.target system.slice basic.target dbus.socket systemd-journald.socket cloud-init-local.service dbus-broker.service", "Documentation": 
"\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Tue 2024-09-24 14:50:22 EDT", "StateChangeTimestampMonotonic": "413618667", "InactiveExitTimestamp": "Tue 2024-09-24 14:44:02 EDT", "InactiveExitTimestampMonotonic": "33322542", "ActiveEnterTimestamp": "Tue 2024-09-24 14:44:03 EDT", "ActiveEnterTimestampMonotonic": "34680535", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Tue 2024-09-24 14:44:02 EDT", "ConditionTimestampMonotonic": "33321151", "AssertTimestamp": "Tue 2024-09-24 14:44:02 EDT", "AssertTimestampMonotonic": "33321155", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "53c91cc8356748b484feba73dc5ee144", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} <<< 44109 1727204239.80966: stderr chunk (state=3): 
>>>debug2: Received exit status from master 0 Shared connection to 10.31.14.47 closed. <<< 44109 1727204239.80970: stdout chunk (state=3): >>><<< 44109 1727204239.80979: stderr chunk (state=3): >>><<< 44109 1727204239.80997: _low_level_execute_command() done: rc=0, stdout= {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "705", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Tue 2024-09-24 14:44:02 EDT", "ExecMainStartTimestampMonotonic": "33322039", "ExecMainExitTimestampMonotonic": "0", "ExecMainHandoffTimestamp": "Tue 2024-09-24 14:44:02 EDT", "ExecMainHandoffTimestampMonotonic": "33336258", "ExecMainPID": "705", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ 
path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "2938", "MemoryCurrent": "10727424", "MemoryPeak": "13869056", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3298058240", "EffectiveMemoryMax": "3702874112", "EffectiveMemoryHigh": "3702874112", "CPUUsageNSec": "1755864000", "TasksCurrent": "4", "EffectiveTasksMax": "22362", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": 
"infinity", "MemoryZSwapWriteback": "yes", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22362", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpReceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13976", "LimitNPROCSoft": "13976", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13976", "LimitSIGPENDINGSoft": "13976", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid 
cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": 
"yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "sysinit.target system.slice dbus.socket", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "NetworkManager-wait-online.service cloud-init.service network.target shutdown.target multi-user.target", "After": "network-pre.target sysinit.target system.slice basic.target dbus.socket systemd-journald.socket cloud-init-local.service dbus-broker.service", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Tue 2024-09-24 14:50:22 EDT", "StateChangeTimestampMonotonic": "413618667", "InactiveExitTimestamp": "Tue 2024-09-24 14:44:02 EDT", "InactiveExitTimestampMonotonic": "33322542", "ActiveEnterTimestamp": "Tue 2024-09-24 14:44:03 EDT", "ActiveEnterTimestampMonotonic": "34680535", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Tue 2024-09-24 14:44:02 EDT", "ConditionTimestampMonotonic": "33321151", 
"AssertTimestamp": "Tue 2024-09-24 14:44:02 EDT", "AssertTimestampMonotonic": "33321155", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "53c91cc8356748b484feba73dc5ee144", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.47 closed. 
44109 1727204239.81383: done with _execute_module (ansible.legacy.systemd, {'name': 'NetworkManager', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204238.952494-45727-54789428942887/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 44109 1727204239.81386: _low_level_execute_command(): starting 44109 1727204239.81389: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204238.952494-45727-54789428942887/ > /dev/null 2>&1 && sleep 0' 44109 1727204239.82425: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44109 1727204239.82447: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44109 1727204239.82462: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44109 1727204239.82486: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44109 1727204239.82569: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44109 1727204239.82627: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 44109 1727204239.82790: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44109 1727204239.82802: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44109 1727204239.83008: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44109 1727204239.85234: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44109 1727204239.85242: stdout chunk (state=3): >>><<< 44109 1727204239.85251: stderr chunk (state=3): >>><<< 44109 1727204239.85268: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK 
debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44109 1727204239.85291: handler run complete 44109 1727204239.85682: attempt loop complete, returning result 44109 1727204239.85685: _execute() done 44109 1727204239.85688: dumping result to json 44109 1727204239.85690: done dumping result, returning 44109 1727204239.85692: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [028d2410-947f-ed67-a560-000000000023] 44109 1727204239.85694: sending task result for task 028d2410-947f-ed67-a560-000000000023 ok: [managed-node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 44109 1727204239.86229: no more pending results, returning what we have 44109 1727204239.86232: results queue empty 44109 1727204239.86233: checking for any_errors_fatal 44109 1727204239.86240: done checking for any_errors_fatal 44109 1727204239.86241: checking for max_fail_percentage 44109 1727204239.86243: done checking for max_fail_percentage 44109 1727204239.86244: checking to see if all hosts have failed and the running result is not ok 44109 1727204239.86244: done checking to see if all hosts have failed 44109 1727204239.86245: getting the remaining hosts for this loop 44109 1727204239.86247: done getting the remaining hosts for this loop 44109 1727204239.86250: getting the next task for host managed-node1 44109 1727204239.86257: done getting next task for host managed-node1 44109 1727204239.86260: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 44109 1727204239.86263: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44109 1727204239.86279: getting variables 44109 1727204239.86281: in VariableManager get_vars() 44109 1727204239.86318: Calling all_inventory to load vars for managed-node1 44109 1727204239.86321: Calling groups_inventory to load vars for managed-node1 44109 1727204239.86323: Calling all_plugins_inventory to load vars for managed-node1 44109 1727204239.86333: Calling all_plugins_play to load vars for managed-node1 44109 1727204239.86335: Calling groups_plugins_inventory to load vars for managed-node1 44109 1727204239.86338: Calling groups_plugins_play to load vars for managed-node1 44109 1727204239.87186: done sending task result for task 028d2410-947f-ed67-a560-000000000023 44109 1727204239.87190: WORKER PROCESS EXITING 44109 1727204239.89474: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44109 1727204239.92584: done with get_vars() 44109 1727204239.92613: done getting variables 44109 1727204239.92674: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Tuesday 24 September 2024 14:57:19 -0400 (0:00:01.132) 0:00:16.725 ***** 44109 1727204239.92914: entering _queue_task() for 
managed-node1/service 44109 1727204239.93361: worker is 1 (out of 1 available) 44109 1727204239.93678: exiting _queue_task() for managed-node1/service 44109 1727204239.93689: done queuing things up, now waiting for results queue to drain 44109 1727204239.93691: waiting for pending results... 44109 1727204239.93989: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 44109 1727204239.94250: in run() - task 028d2410-947f-ed67-a560-000000000024 44109 1727204239.94266: variable 'ansible_search_path' from source: unknown 44109 1727204239.94270: variable 'ansible_search_path' from source: unknown 44109 1727204239.94310: calling self._execute() 44109 1727204239.94592: variable 'ansible_host' from source: host vars for 'managed-node1' 44109 1727204239.94599: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44109 1727204239.94610: variable 'omit' from source: magic vars 44109 1727204239.95787: variable 'ansible_distribution_major_version' from source: facts 44109 1727204239.95799: Evaluated conditional (ansible_distribution_major_version != '6'): True 44109 1727204239.96246: variable 'network_provider' from source: set_fact 44109 1727204239.96250: Evaluated conditional (network_provider == "nm"): True 44109 1727204239.96563: variable '__network_wpa_supplicant_required' from source: role '' defaults 44109 1727204239.96844: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 44109 1727204239.97506: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 44109 1727204240.02093: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 44109 1727204240.02097: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 44109 1727204240.02219: Loading FilterModule 'mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 44109 1727204240.02273: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 44109 1727204240.02374: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 44109 1727204240.02594: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44109 1727204240.02675: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44109 1727204240.02783: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44109 1727204240.02869: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44109 1727204240.02895: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44109 1727204240.02999: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44109 1727204240.03092: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44109 1727204240.03130: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44109 1727204240.03230: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44109 1727204240.03299: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44109 1727204240.03409: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44109 1727204240.03443: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44109 1727204240.03544: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44109 1727204240.03652: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44109 1727204240.03656: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44109 1727204240.03964: variable 'network_connections' from source: task vars 44109 1727204240.03990: variable 'interface' from source: set_fact 44109 1727204240.04193: variable 'interface' from source: set_fact 44109 1727204240.04208: variable 
'interface' from source: set_fact 44109 1727204240.04585: variable 'interface' from source: set_fact 44109 1727204240.04616: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 44109 1727204240.04996: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 44109 1727204240.05046: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 44109 1727204240.05081: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 44109 1727204240.05158: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 44109 1727204240.05351: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 44109 1727204240.05383: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 44109 1727204240.05416: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 44109 1727204240.05545: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 44109 1727204240.05604: variable '__network_wireless_connections_defined' from source: role '' defaults 44109 1727204240.06159: variable 'network_connections' from source: task vars 44109 1727204240.06228: variable 'interface' from source: set_fact 44109 1727204240.06443: variable 'interface' from source: set_fact 44109 1727204240.06519: variable 'interface' from source: set_fact 44109 
1727204240.06522: variable 'interface' from source: set_fact 44109 1727204240.06699: Evaluated conditional (__network_wpa_supplicant_required): False 44109 1727204240.06708: when evaluation is False, skipping this task 44109 1727204240.06717: _execute() done 44109 1727204240.06768: dumping result to json 44109 1727204240.06779: done dumping result, returning 44109 1727204240.06793: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [028d2410-947f-ed67-a560-000000000024] 44109 1727204240.06873: sending task result for task 028d2410-947f-ed67-a560-000000000024 44109 1727204240.07200: done sending task result for task 028d2410-947f-ed67-a560-000000000024 44109 1727204240.07204: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "__network_wpa_supplicant_required", "skip_reason": "Conditional result was False" } 44109 1727204240.07250: no more pending results, returning what we have 44109 1727204240.07254: results queue empty 44109 1727204240.07255: checking for any_errors_fatal 44109 1727204240.07277: done checking for any_errors_fatal 44109 1727204240.07278: checking for max_fail_percentage 44109 1727204240.07280: done checking for max_fail_percentage 44109 1727204240.07281: checking to see if all hosts have failed and the running result is not ok 44109 1727204240.07282: done checking to see if all hosts have failed 44109 1727204240.07283: getting the remaining hosts for this loop 44109 1727204240.07285: done getting the remaining hosts for this loop 44109 1727204240.07289: getting the next task for host managed-node1 44109 1727204240.07297: done getting next task for host managed-node1 44109 1727204240.07301: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service 44109 1727204240.07305: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, 
pending_setup=False, tasks child state? (HOST STATE: block=0, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44109 1727204240.07320: getting variables 44109 1727204240.07322: in VariableManager get_vars() 44109 1727204240.07359: Calling all_inventory to load vars for managed-node1 44109 1727204240.07362: Calling groups_inventory to load vars for managed-node1 44109 1727204240.07364: Calling all_plugins_inventory to load vars for managed-node1 44109 1727204240.07374: Calling all_plugins_play to load vars for managed-node1 44109 1727204240.07581: Calling groups_plugins_inventory to load vars for managed-node1 44109 1727204240.07586: Calling groups_plugins_play to load vars for managed-node1 44109 1727204240.10096: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44109 1727204240.13292: done with get_vars() 44109 1727204240.13326: done getting variables 44109 1727204240.13596: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable network service] ************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 Tuesday 24 September 2024 14:57:20 -0400 (0:00:00.207) 0:00:16.932 ***** 44109 1727204240.13632: entering _queue_task() for managed-node1/service 44109 1727204240.14411: worker is 1 (out of 1 available) 44109 1727204240.14421: exiting 
_queue_task() for managed-node1/service 44109 1727204240.14431: done queuing things up, now waiting for results queue to drain 44109 1727204240.14432: waiting for pending results... 44109 1727204240.14779: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Enable network service 44109 1727204240.15182: in run() - task 028d2410-947f-ed67-a560-000000000025 44109 1727204240.15191: variable 'ansible_search_path' from source: unknown 44109 1727204240.15195: variable 'ansible_search_path' from source: unknown 44109 1727204240.15381: calling self._execute() 44109 1727204240.15560: variable 'ansible_host' from source: host vars for 'managed-node1' 44109 1727204240.15563: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44109 1727204240.15565: variable 'omit' from source: magic vars 44109 1727204240.16349: variable 'ansible_distribution_major_version' from source: facts 44109 1727204240.16643: Evaluated conditional (ansible_distribution_major_version != '6'): True 44109 1727204240.16700: variable 'network_provider' from source: set_fact 44109 1727204240.16715: Evaluated conditional (network_provider == "initscripts"): False 44109 1727204240.16724: when evaluation is False, skipping this task 44109 1727204240.16756: _execute() done 44109 1727204240.16764: dumping result to json 44109 1727204240.16771: done dumping result, returning 44109 1727204240.16784: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Enable network service [028d2410-947f-ed67-a560-000000000025] 44109 1727204240.16968: sending task result for task 028d2410-947f-ed67-a560-000000000025 44109 1727204240.17046: done sending task result for task 028d2410-947f-ed67-a560-000000000025 44109 1727204240.17049: WORKER PROCESS EXITING skipping: [managed-node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 44109 1727204240.17113: no more 
pending results, returning what we have 44109 1727204240.17117: results queue empty 44109 1727204240.17118: checking for any_errors_fatal 44109 1727204240.17126: done checking for any_errors_fatal 44109 1727204240.17127: checking for max_fail_percentage 44109 1727204240.17129: done checking for max_fail_percentage 44109 1727204240.17130: checking to see if all hosts have failed and the running result is not ok 44109 1727204240.17131: done checking to see if all hosts have failed 44109 1727204240.17132: getting the remaining hosts for this loop 44109 1727204240.17133: done getting the remaining hosts for this loop 44109 1727204240.17137: getting the next task for host managed-node1 44109 1727204240.17145: done getting next task for host managed-node1 44109 1727204240.17149: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 44109 1727204240.17153: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44109 1727204240.17171: getting variables 44109 1727204240.17173: in VariableManager get_vars() 44109 1727204240.17212: Calling all_inventory to load vars for managed-node1 44109 1727204240.17215: Calling groups_inventory to load vars for managed-node1 44109 1727204240.17217: Calling all_plugins_inventory to load vars for managed-node1 44109 1727204240.17228: Calling all_plugins_play to load vars for managed-node1 44109 1727204240.17231: Calling groups_plugins_inventory to load vars for managed-node1 44109 1727204240.17233: Calling groups_plugins_play to load vars for managed-node1 44109 1727204240.20349: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44109 1727204240.23621: done with get_vars() 44109 1727204240.23652: done getting variables 44109 1727204240.23719: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150 Tuesday 24 September 2024 14:57:20 -0400 (0:00:00.103) 0:00:17.036 ***** 44109 1727204240.24032: entering _queue_task() for managed-node1/copy 44109 1727204240.24362: worker is 1 (out of 1 available) 44109 1727204240.24373: exiting _queue_task() for managed-node1/copy 44109 1727204240.24787: done queuing things up, now waiting for results queue to drain 44109 1727204240.24789: waiting for pending results... 
44109 1727204240.25043: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 44109 1727204240.25494: in run() - task 028d2410-947f-ed67-a560-000000000026 44109 1727204240.25498: variable 'ansible_search_path' from source: unknown 44109 1727204240.25500: variable 'ansible_search_path' from source: unknown 44109 1727204240.25502: calling self._execute() 44109 1727204240.25639: variable 'ansible_host' from source: host vars for 'managed-node1' 44109 1727204240.25653: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44109 1727204240.25881: variable 'omit' from source: magic vars 44109 1727204240.26572: variable 'ansible_distribution_major_version' from source: facts 44109 1727204240.26683: Evaluated conditional (ansible_distribution_major_version != '6'): True 44109 1727204240.26706: variable 'network_provider' from source: set_fact 44109 1727204240.26798: Evaluated conditional (network_provider == "initscripts"): False 44109 1727204240.26805: when evaluation is False, skipping this task 44109 1727204240.26811: _execute() done 44109 1727204240.26817: dumping result to json 44109 1727204240.26823: done dumping result, returning 44109 1727204240.26833: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [028d2410-947f-ed67-a560-000000000026] 44109 1727204240.26840: sending task result for task 028d2410-947f-ed67-a560-000000000026 skipping: [managed-node1] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 44109 1727204240.26992: no more pending results, returning what we have 44109 1727204240.26996: results queue empty 44109 1727204240.26997: checking for any_errors_fatal 44109 1727204240.27005: done checking for any_errors_fatal 44109 1727204240.27006: checking for max_fail_percentage 44109 
1727204240.27008: done checking for max_fail_percentage 44109 1727204240.27009: checking to see if all hosts have failed and the running result is not ok 44109 1727204240.27010: done checking to see if all hosts have failed 44109 1727204240.27011: getting the remaining hosts for this loop 44109 1727204240.27012: done getting the remaining hosts for this loop 44109 1727204240.27016: getting the next task for host managed-node1 44109 1727204240.27024: done getting next task for host managed-node1 44109 1727204240.27028: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles 44109 1727204240.27032: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44109 1727204240.27051: getting variables 44109 1727204240.27053: in VariableManager get_vars() 44109 1727204240.27095: Calling all_inventory to load vars for managed-node1 44109 1727204240.27098: Calling groups_inventory to load vars for managed-node1 44109 1727204240.27101: Calling all_plugins_inventory to load vars for managed-node1 44109 1727204240.27113: Calling all_plugins_play to load vars for managed-node1 44109 1727204240.27116: Calling groups_plugins_inventory to load vars for managed-node1 44109 1727204240.27119: Calling groups_plugins_play to load vars for managed-node1 44109 1727204240.28445: done sending task result for task 028d2410-947f-ed67-a560-000000000026 44109 1727204240.28449: WORKER PROCESS EXITING 44109 1727204240.30099: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44109 1727204240.32020: done with get_vars() 44109 1727204240.32043: done getting variables TASK [fedora.linux_system_roles.network : Configure networking connection profiles] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Tuesday 24 September 2024 14:57:20 -0400 (0:00:00.081) 0:00:17.117 ***** 44109 1727204240.32138: entering _queue_task() for managed-node1/fedora.linux_system_roles.network_connections 44109 1727204240.32140: Creating lock for fedora.linux_system_roles.network_connections 44109 1727204240.32497: worker is 1 (out of 1 available) 44109 1727204240.32511: exiting _queue_task() for managed-node1/fedora.linux_system_roles.network_connections 44109 1727204240.32523: done queuing things up, now waiting for results queue to drain 44109 1727204240.32524: waiting for pending results... 
44109 1727204240.32925: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 44109 1727204240.33119: in run() - task 028d2410-947f-ed67-a560-000000000027 44109 1727204240.33140: variable 'ansible_search_path' from source: unknown 44109 1727204240.33184: variable 'ansible_search_path' from source: unknown 44109 1727204240.33233: calling self._execute() 44109 1727204240.33403: variable 'ansible_host' from source: host vars for 'managed-node1' 44109 1727204240.33495: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44109 1727204240.33512: variable 'omit' from source: magic vars 44109 1727204240.33935: variable 'ansible_distribution_major_version' from source: facts 44109 1727204240.33950: Evaluated conditional (ansible_distribution_major_version != '6'): True 44109 1727204240.33962: variable 'omit' from source: magic vars 44109 1727204240.34029: variable 'omit' from source: magic vars 44109 1727204240.34198: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 44109 1727204240.36553: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 44109 1727204240.36635: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 44109 1727204240.36681: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 44109 1727204240.36720: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 44109 1727204240.36757: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 44109 1727204240.36852: variable 'network_provider' from source: set_fact 44109 1727204240.36980: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44109 1727204240.37028: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44109 1727204240.37118: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44109 1727204240.37121: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44109 1727204240.37123: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44109 1727204240.37180: variable 'omit' from source: magic vars 44109 1727204240.37302: variable 'omit' from source: magic vars 44109 1727204240.37409: variable 'network_connections' from source: task vars 44109 1727204240.37424: variable 'interface' from source: set_fact 44109 1727204240.37496: variable 'interface' from source: set_fact 44109 1727204240.37509: variable 'interface' from source: set_fact 44109 1727204240.37578: variable 'interface' from source: set_fact 44109 1727204240.37932: variable 'omit' from source: magic vars 44109 1727204240.37945: variable '__lsr_ansible_managed' from source: task vars 44109 1727204240.38080: variable '__lsr_ansible_managed' from source: task vars 44109 1727204240.38288: Loaded config def from plugin (lookup/template) 44109 1727204240.38299: Loading LookupModule 'template' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/template.py 44109 1727204240.38361: File lookup term: get_ansible_managed.j2 44109 
1727204240.38369: variable 'ansible_search_path' from source: unknown 44109 1727204240.38381: evaluation_path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks 44109 1727204240.38397: search_path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/templates/get_ansible_managed.j2 /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/get_ansible_managed.j2 /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/templates/get_ansible_managed.j2 /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/get_ansible_managed.j2 /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 44109 1727204240.38424: variable 'ansible_search_path' from source: unknown 44109 1727204240.46789: variable 'ansible_managed' from source: unknown 44109 1727204240.46954: variable 'omit' from source: magic vars 44109 1727204240.47136: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 44109 1727204240.47140: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 44109 1727204240.47142: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 44109 1727204240.47144: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44109 1727204240.47146: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44109 1727204240.47271: variable 'inventory_hostname' from source: host vars for 'managed-node1' 44109 1727204240.47284: variable 'ansible_host' from source: host vars for 'managed-node1' 44109 1727204240.47301: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44109 1727204240.47584: Set connection var ansible_connection to ssh 44109 1727204240.47587: Set connection var ansible_timeout to 10 44109 1727204240.47589: Set connection var ansible_module_compression to ZIP_DEFLATED 44109 1727204240.47591: Set connection var ansible_pipelining to False 44109 1727204240.47593: Set connection var ansible_shell_executable to /bin/sh 44109 1727204240.47595: Set connection var ansible_shell_type to sh 44109 1727204240.47694: variable 'ansible_shell_executable' from source: unknown 44109 1727204240.47709: variable 'ansible_connection' from source: unknown 44109 1727204240.47738: variable 'ansible_module_compression' from source: unknown 44109 1727204240.47746: variable 'ansible_shell_type' from source: unknown 44109 1727204240.47755: variable 'ansible_shell_executable' from source: unknown 44109 1727204240.47762: variable 'ansible_host' from source: host vars for 'managed-node1' 44109 1727204240.47770: variable 'ansible_pipelining' from source: unknown 44109 1727204240.47779: variable 'ansible_timeout' from source: unknown 44109 1727204240.47794: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44109 1727204240.47938: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 44109 1727204240.47961: variable 'omit' from source: magic vars 44109 1727204240.47978: starting attempt loop 44109 1727204240.47987: running the handler 44109 1727204240.48014: _low_level_execute_command(): starting 44109 1727204240.48028: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 44109 1727204240.48747: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44109 1727204240.48761: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44109 1727204240.48883: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK <<< 44109 1727204240.48899: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44109 1727204240.49017: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44109 1727204240.50773: stdout chunk (state=3): >>>/root <<< 44109 1727204240.50982: stderr chunk (state=3): >>>debug2: Received exit status from 
master 0 <<< 44109 1727204240.50985: stdout chunk (state=3): >>><<< 44109 1727204240.50987: stderr chunk (state=3): >>><<< 44109 1727204240.50990: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44109 1727204240.50993: _low_level_execute_command(): starting 44109 1727204240.50996: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204240.5095193-45931-278325182339810 `" && echo ansible-tmp-1727204240.5095193-45931-278325182339810="` echo /root/.ansible/tmp/ansible-tmp-1727204240.5095193-45931-278325182339810 `" ) && sleep 0' 44109 1727204240.51631: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44109 1727204240.51646: stderr chunk (state=3): >>>debug1: Reading configuration data 
/root/.ssh/config <<< 44109 1727204240.51657: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44109 1727204240.51671: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44109 1727204240.51686: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 <<< 44109 1727204240.51746: stderr chunk (state=3): >>>debug2: match not found <<< 44109 1727204240.51749: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44109 1727204240.51752: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44109 1727204240.51801: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 44109 1727204240.51823: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44109 1727204240.51851: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44109 1727204240.51952: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44109 1727204240.54057: stdout chunk (state=3): >>>ansible-tmp-1727204240.5095193-45931-278325182339810=/root/.ansible/tmp/ansible-tmp-1727204240.5095193-45931-278325182339810 <<< 44109 1727204240.54240: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44109 1727204240.54243: stdout chunk (state=3): >>><<< 44109 1727204240.54300: stderr chunk (state=3): >>><<< 44109 
1727204240.54346: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204240.5095193-45931-278325182339810=/root/.ansible/tmp/ansible-tmp-1727204240.5095193-45931-278325182339810 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44109 1727204240.54480: variable 'ansible_module_compression' from source: unknown 44109 1727204240.54649: ANSIBALLZ: Using lock for fedora.linux_system_roles.network_connections 44109 1727204240.54881: ANSIBALLZ: Acquiring lock 44109 1727204240.54884: ANSIBALLZ: Lock acquired: 139907469369056 44109 1727204240.54886: ANSIBALLZ: Creating module 44109 1727204240.86636: ANSIBALLZ: Writing module into payload 44109 1727204240.87223: ANSIBALLZ: Writing module 44109 1727204240.87260: ANSIBALLZ: Renaming module 44109 1727204240.87273: ANSIBALLZ: Done creating module 44109 1727204240.87344: variable 'ansible_facts' from source: 
unknown 44109 1727204240.87423: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204240.5095193-45931-278325182339810/AnsiballZ_network_connections.py 44109 1727204240.87582: Sending initial data 44109 1727204240.87684: Sent initial data (168 bytes) 44109 1727204240.88253: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44109 1727204240.88269: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44109 1727204240.88285: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44109 1727204240.88339: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44109 1727204240.88408: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 44109 1727204240.88428: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44109 1727204240.88464: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44109 1727204240.88690: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44109 1727204240.90441: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server 
supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 44109 1727204240.90651: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 44109 1727204240.90723: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-44109pzfqangk/tmp1y1_iqyp /root/.ansible/tmp/ansible-tmp-1727204240.5095193-45931-278325182339810/AnsiballZ_network_connections.py <<< 44109 1727204240.90727: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204240.5095193-45931-278325182339810/AnsiballZ_network_connections.py" <<< 44109 1727204240.90817: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-44109pzfqangk/tmp1y1_iqyp" to remote "/root/.ansible/tmp/ansible-tmp-1727204240.5095193-45931-278325182339810/AnsiballZ_network_connections.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204240.5095193-45931-278325182339810/AnsiballZ_network_connections.py" <<< 44109 1727204240.92952: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44109 1727204240.92957: stdout chunk (state=3): >>><<< 44109 1727204240.92960: stderr chunk (state=3): >>><<< 44109 1727204240.92989: done transferring module to remote 44109 1727204240.93005: _low_level_execute_command(): starting 44109 1727204240.93014: 
_low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204240.5095193-45931-278325182339810/ /root/.ansible/tmp/ansible-tmp-1727204240.5095193-45931-278325182339810/AnsiballZ_network_connections.py && sleep 0' 44109 1727204240.93747: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44109 1727204240.93762: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44109 1727204240.93780: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44109 1727204240.93838: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found <<< 44109 1727204240.93853: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44109 1727204240.93954: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK <<< 44109 1727204240.93994: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44109 1727204240.94086: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44109 1727204240.96086: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44109 1727204240.96090: stdout chunk (state=3): >>><<< 44109 1727204240.96092: stderr 
chunk (state=3): >>><<< 44109 1727204240.96108: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44109 1727204240.96194: _low_level_execute_command(): starting 44109 1727204240.96198: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204240.5095193-45931-278325182339810/AnsiballZ_network_connections.py && sleep 0' 44109 1727204240.96661: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44109 1727204240.96665: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found <<< 44109 
1727204240.96694: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44109 1727204240.96698: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44109 1727204240.96705: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found <<< 44109 1727204240.96719: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44109 1727204240.96760: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 44109 1727204240.96763: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 44109 1727204240.96854: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44109 1727204241.43883: stdout chunk (state=3): >>> {"changed": true, "warnings": [], "stderr": "[003] #0, state:up persistent_state:present, 'ethtest0': add connection ethtest0, 51eb3d23-af2d-42f8-aa46-41043d97d664\n[004] #0, state:up persistent_state:present, 'ethtest0': up connection ethtest0, 51eb3d23-af2d-42f8-aa46-41043d97d664 (not-active)\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "ethtest0", "interface_name": "ethtest0", "state": "up", "type": "ethernet", "autoconnect": true, "ip": {"dhcp4": false, "address": ["198.51.100.3/26", "2001:db8::2/32"], "route": [{"network": "198.51.100.64", "prefix": 26, "gateway": "198.51.100.6", "metric": 4, "table": 30200}, {"network": "198.51.100.128", "prefix": 26, "gateway": "198.51.100.1", "metric": 2, "table": 30400}, {"network": 
"2001:db8::4", "prefix": 32, "gateway": "2001:db8::1", "metric": 2, "table": 30600}], "routing_rule": [{"priority": 30200, "from": "198.51.100.58/26", "table": 30200}, {"priority": 30201, "family": "ipv4", "fwmark": 1, "fwmask": 1, "table": 30200}, {"priority": 30202, "family": "ipv4", "ipproto": 6, "table": 30200}, {"priority": 30203, "family": "ipv4", "sport": "128 - 256", "table": 30200}, {"priority": 30204, "family": "ipv4", "tos": 8, "table": 30200}, {"priority": 30400, "to": "198.51.100.128/26", "table": 30400}, {"priority": 30401, "family": "ipv4", "iif": "iiftest", "table": 30400}, {"priority": 30402, "family": "ipv4", "oif": "oiftest", "table": 30400}, {"priority": 30403, "from": "0.0.0.0/0", "to": "0.0.0.0/0", "table": 30400}, {"priority": 30600, "to": "2001:db8::4/32", "table": 30600}, {"priority": 30601, "family": "ipv6", "dport": "128 - 256", "invert": true, "table": 30600}, {"priority": 30602, "from": "::/0", "to": "::/0", "table": 30600}, {"priority": 200, "from": "198.51.100.56/26", "table": "custom"}]}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "ethtest0", "interface_name": "ethtest0", "state": "up", "type": "ethernet", "autoconnect": true, "ip": {"dhcp4": false, "address": ["198.51.100.3/26", "2001:db8::2/32"], "route": [{"network": "198.51.100.64", "prefix": 26, "gateway": "198.51.100.6", "metric": 4, "table": 30200}, {"network": "198.51.100.128", "prefix": 26, "gateway": "198.51.100.1", "metric": 2, "table": 30400}, {"network": "2001:db8::4", "prefix": 32, "gateway": "2001:db8::1", "metric": 2, "table": 30600}], "routing_rule": [{"priority": 30200, "from": "198.51.100.58/26", "table": 30200}, {"priority": 30201, "family": "ipv4", "fwmark": 1, "fwmask": 1, "table": 30200}, {"priority": 30202, "family": "ipv4", "ipproto": 6, "table": 30200}, {"priority": 30203, "family": 
"ipv4", "sport": "128 - 256", "table": 30200}, {"priority": 30204, "family": "ipv4", "tos": 8, "table": 30200}, {"priority": 30400, "to": "198.51.100.128/26", "table": 30400}, {"priority": 30401, "family": "ipv4", "iif": "iiftest", "table": 30400}, {"priority": 30402, "family": "ipv4", "oif": "oiftest", "table": 30400}, {"priority": 30403, "from": "0.0.0.0/0", "to": "0.0.0.0/0", "table": 30400}, {"priority": 30600, "to": "2001:db8::4/32", "table": 30600}, {"priority": 30601, "family": "ipv6", "dport": "128 - 256", "invert": true, "table": 30600}, {"priority": 30602, "from": "::/0", "to": "::/0", "table": 30600}, {"priority": 200, "from": "198.51.100.56/26", "table": "custom"}]}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} <<< 44109 1727204241.45607: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44109 1727204241.45627: stderr chunk (state=3): >>>Shared connection to 10.31.14.47 closed. 
<<< 44109 1727204241.45685: stderr chunk (state=3): >>><<< 44109 1727204241.45705: stdout chunk (state=3): >>><<< 44109 1727204241.45743: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "warnings": [], "stderr": "[003] #0, state:up persistent_state:present, 'ethtest0': add connection ethtest0, 51eb3d23-af2d-42f8-aa46-41043d97d664\n[004] #0, state:up persistent_state:present, 'ethtest0': up connection ethtest0, 51eb3d23-af2d-42f8-aa46-41043d97d664 (not-active)\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "ethtest0", "interface_name": "ethtest0", "state": "up", "type": "ethernet", "autoconnect": true, "ip": {"dhcp4": false, "address": ["198.51.100.3/26", "2001:db8::2/32"], "route": [{"network": "198.51.100.64", "prefix": 26, "gateway": "198.51.100.6", "metric": 4, "table": 30200}, {"network": "198.51.100.128", "prefix": 26, "gateway": "198.51.100.1", "metric": 2, "table": 30400}, {"network": "2001:db8::4", "prefix": 32, "gateway": "2001:db8::1", "metric": 2, "table": 30600}], "routing_rule": [{"priority": 30200, "from": "198.51.100.58/26", "table": 30200}, {"priority": 30201, "family": "ipv4", "fwmark": 1, "fwmask": 1, "table": 30200}, {"priority": 30202, "family": "ipv4", "ipproto": 6, "table": 30200}, {"priority": 30203, "family": "ipv4", "sport": "128 - 256", "table": 30200}, {"priority": 30204, "family": "ipv4", "tos": 8, "table": 30200}, {"priority": 30400, "to": "198.51.100.128/26", "table": 30400}, {"priority": 30401, "family": "ipv4", "iif": "iiftest", "table": 30400}, {"priority": 30402, "family": "ipv4", "oif": "oiftest", "table": 30400}, {"priority": 30403, "from": "0.0.0.0/0", "to": "0.0.0.0/0", "table": 30400}, {"priority": 30600, "to": "2001:db8::4/32", "table": 30600}, {"priority": 30601, "family": "ipv6", "dport": "128 - 256", "invert": true, "table": 30600}, {"priority": 30602, "from": "::/0", "to": "::/0", "table": 30600}, {"priority": 200, "from": "198.51.100.56/26", "table": "custom"}]}}], 
"__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "ethtest0", "interface_name": "ethtest0", "state": "up", "type": "ethernet", "autoconnect": true, "ip": {"dhcp4": false, "address": ["198.51.100.3/26", "2001:db8::2/32"], "route": [{"network": "198.51.100.64", "prefix": 26, "gateway": "198.51.100.6", "metric": 4, "table": 30200}, {"network": "198.51.100.128", "prefix": 26, "gateway": "198.51.100.1", "metric": 2, "table": 30400}, {"network": "2001:db8::4", "prefix": 32, "gateway": "2001:db8::1", "metric": 2, "table": 30600}], "routing_rule": [{"priority": 30200, "from": "198.51.100.58/26", "table": 30200}, {"priority": 30201, "family": "ipv4", "fwmark": 1, "fwmask": 1, "table": 30200}, {"priority": 30202, "family": "ipv4", "ipproto": 6, "table": 30200}, {"priority": 30203, "family": "ipv4", "sport": "128 - 256", "table": 30200}, {"priority": 30204, "family": "ipv4", "tos": 8, "table": 30200}, {"priority": 30400, "to": "198.51.100.128/26", "table": 30400}, {"priority": 30401, "family": "ipv4", "iif": "iiftest", "table": 30400}, {"priority": 30402, "family": "ipv4", "oif": "oiftest", "table": 30400}, {"priority": 30403, "from": "0.0.0.0/0", "to": "0.0.0.0/0", "table": 30400}, {"priority": 30600, "to": "2001:db8::4/32", "table": 30600}, {"priority": 30601, "family": "ipv6", "dport": "128 - 256", "invert": true, "table": 30600}, {"priority": 30602, "from": "::/0", "to": "::/0", "table": 30600}, {"priority": 200, "from": "198.51.100.56/26", "table": "custom"}]}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.47 closed. 
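The record above shows the module's entire result arriving as one JSON document on stdout, which the task executor then parses into the task result. A minimal sketch of that parse step, using a hypothetical, heavily abbreviated sample in the same shape as the `AnsiballZ_network_connections.py` output captured above (not the full payload):

```python
import json

# Hypothetical, abbreviated stand-in for the module JSON printed on stdout
# by AnsiballZ_network_connections.py in the log record above.
raw = (
    '{"changed": true, "warnings": [], '
    '"invocation": {"module_args": {"provider": "nm", '
    '"connections": [{"name": "ethtest0", "state": "up", '
    '"type": "ethernet"}]}}}'
)

result = json.loads(raw)
module_args = result["invocation"]["module_args"]
connection_names = [c["name"] for c in module_args["connections"]]
print(result["changed"], module_args["provider"], connection_names)
```

This is why the log shows both a raw `stdout chunk` and, later, the pretty-printed `changed: [managed-node1] => {...}` block: the second is the parsed form of the first.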
44109 1727204241.45922: done with _execute_module (fedora.linux_system_roles.network_connections, {'provider': 'nm', 'connections': [{'name': 'ethtest0', 'interface_name': 'ethtest0', 'state': 'up', 'type': 'ethernet', 'autoconnect': True, 'ip': {'dhcp4': False, 'address': ['198.51.100.3/26', '2001:db8::2/32'], 'route': [{'network': '198.51.100.64', 'prefix': 26, 'gateway': '198.51.100.6', 'metric': 4, 'table': 30200}, {'network': '198.51.100.128', 'prefix': 26, 'gateway': '198.51.100.1', 'metric': 2, 'table': 30400}, {'network': '2001:db8::4', 'prefix': 32, 'gateway': '2001:db8::1', 'metric': 2, 'table': 30600}], 'routing_rule': [{'priority': 30200, 'from': '198.51.100.58/26', 'table': 30200}, {'priority': 30201, 'family': 'ipv4', 'fwmark': 1, 'fwmask': 1, 'table': 30200}, {'priority': 30202, 'family': 'ipv4', 'ipproto': 6, 'table': 30200}, {'priority': 30203, 'family': 'ipv4', 'sport': '128 - 256', 'table': 30200}, {'priority': 30204, 'family': 'ipv4', 'tos': 8, 'table': 30200}, {'priority': 30400, 'to': '198.51.100.128/26', 'table': 30400}, {'priority': 30401, 'family': 'ipv4', 'iif': 'iiftest', 'table': 30400}, {'priority': 30402, 'family': 'ipv4', 'oif': 'oiftest', 'table': 30400}, {'priority': 30403, 'from': '0.0.0.0/0', 'to': '0.0.0.0/0', 'table': 30400}, {'priority': 30600, 'to': '2001:db8::4/32', 'table': 30600}, {'priority': 30601, 'family': 'ipv6', 'dport': '128 - 256', 'invert': True, 'table': 30600}, {'priority': 30602, 'from': '::/0', 'to': '::/0', 'table': 30600}, {'priority': 200, 'from': '198.51.100.56/26', 'table': 'custom'}]}}], '__header': '#\n# Ansible managed\n#\n# system_role:network\n', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'fedora.linux_system_roles.network_connections', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], 
'_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204240.5095193-45931-278325182339810/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 44109 1727204241.45957: _low_level_execute_command(): starting 44109 1727204241.45966: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204240.5095193-45931-278325182339810/ > /dev/null 2>&1 && sleep 0' 44109 1727204241.46803: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44109 1727204241.46842: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44109 1727204241.46859: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44109 1727204241.46891: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44109 1727204241.46928: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 44109 1727204241.46950: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44109 1727204241.47050: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44109 1727204241.47123: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at 
'/root/.ansible/cp/a0f5415566' <<< 44109 1727204241.47153: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44109 1727204241.47214: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44109 1727204241.47313: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44109 1727204241.49503: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44109 1727204241.49508: stdout chunk (state=3): >>><<< 44109 1727204241.49510: stderr chunk (state=3): >>><<< 44109 1727204241.49610: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44109 1727204241.49616: handler run complete 44109 1727204241.50001: attempt loop complete, returning result 44109 1727204241.50004: _execute() done 44109 1727204241.50006: dumping result to json 44109 
1727204241.50008: done dumping result, returning 44109 1727204241.50010: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [028d2410-947f-ed67-a560-000000000027] 44109 1727204241.50015: sending task result for task 028d2410-947f-ed67-a560-000000000027 44109 1727204241.50679: done sending task result for task 028d2410-947f-ed67-a560-000000000027 44109 1727204241.50683: WORKER PROCESS EXITING changed: [managed-node1] => { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "autoconnect": true, "interface_name": "ethtest0", "ip": { "address": [ "198.51.100.3/26", "2001:db8::2/32" ], "dhcp4": false, "route": [ { "gateway": "198.51.100.6", "metric": 4, "network": "198.51.100.64", "prefix": 26, "table": 30200 }, { "gateway": "198.51.100.1", "metric": 2, "network": "198.51.100.128", "prefix": 26, "table": 30400 }, { "gateway": "2001:db8::1", "metric": 2, "network": "2001:db8::4", "prefix": 32, "table": 30600 } ], "routing_rule": [ { "from": "198.51.100.58/26", "priority": 30200, "table": 30200 }, { "family": "ipv4", "fwmark": 1, "fwmask": 1, "priority": 30201, "table": 30200 }, { "family": "ipv4", "ipproto": 6, "priority": 30202, "table": 30200 }, { "family": "ipv4", "priority": 30203, "sport": "128 - 256", "table": 30200 }, { "family": "ipv4", "priority": 30204, "table": 30200, "tos": 8 }, { "priority": 30400, "table": 30400, "to": "198.51.100.128/26" }, { "family": "ipv4", "iif": "iiftest", "priority": 30401, "table": 30400 }, { "family": "ipv4", "oif": "oiftest", "priority": 30402, "table": 30400 }, { "from": "0.0.0.0/0", "priority": 30403, "table": 30400, "to": "0.0.0.0/0" }, { "priority": 30600, "table": 30600, "to": "2001:db8::4/32" }, { "dport": "128 - 256", "family": "ipv6", "invert": true, "priority": 30601, "table": 30600 }, { "from": "::/0", "priority": 30602, "table": 30600, "to": "::/0" }, { 
"from": "198.51.100.56/26", "priority": 200, "table": "custom" } ] }, "name": "ethtest0", "state": "up", "type": "ethernet" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true } STDERR: [003] #0, state:up persistent_state:present, 'ethtest0': add connection ethtest0, 51eb3d23-af2d-42f8-aa46-41043d97d664 [004] #0, state:up persistent_state:present, 'ethtest0': up connection ethtest0, 51eb3d23-af2d-42f8-aa46-41043d97d664 (not-active) 44109 1727204241.51664: no more pending results, returning what we have 44109 1727204241.51671: results queue empty 44109 1727204241.51801: checking for any_errors_fatal 44109 1727204241.51808: done checking for any_errors_fatal 44109 1727204241.51809: checking for max_fail_percentage 44109 1727204241.51810: done checking for max_fail_percentage 44109 1727204241.51814: checking to see if all hosts have failed and the running result is not ok 44109 1727204241.51815: done checking to see if all hosts have failed 44109 1727204241.51815: getting the remaining hosts for this loop 44109 1727204241.51817: done getting the remaining hosts for this loop 44109 1727204241.51820: getting the next task for host managed-node1 44109 1727204241.51826: done getting next task for host managed-node1 44109 1727204241.51947: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 44109 1727204241.51950: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44109 1727204241.51963: getting variables 44109 1727204241.51964: in VariableManager get_vars() 44109 1727204241.51999: Calling all_inventory to load vars for managed-node1 44109 1727204241.52002: Calling groups_inventory to load vars for managed-node1 44109 1727204241.52004: Calling all_plugins_inventory to load vars for managed-node1 44109 1727204241.52016: Calling all_plugins_play to load vars for managed-node1 44109 1727204241.52018: Calling groups_plugins_inventory to load vars for managed-node1 44109 1727204241.52021: Calling groups_plugins_play to load vars for managed-node1 44109 1727204241.53440: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44109 1727204241.56632: done with get_vars() 44109 1727204241.56665: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Tuesday 24 September 2024 14:57:21 -0400 (0:00:01.247) 0:00:18.364 ***** 44109 1727204241.56895: entering _queue_task() for managed-node1/fedora.linux_system_roles.network_state 44109 1727204241.56897: Creating lock for fedora.linux_system_roles.network_state 44109 1727204241.57400: worker is 1 (out of 1 available) 44109 1727204241.57410: exiting _queue_task() for managed-node1/fedora.linux_system_roles.network_state 44109 1727204241.57422: done queuing things up, now waiting for results queue to drain 44109 1727204241.57423: waiting for pending results... 
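Each `routing_rule` entry in the task result above corresponds to one kernel policy-routing rule that NetworkManager arranges on the managed node. As a rough illustration only (handling just a few of the keys visible in the log: `priority`, `from`, `to`, `table`, `family`), the entries map to `ip rule add` commands along these lines:

```python
def ip_rule_cmd(rule):
    """Render one routing_rule dict as an `ip rule add` command string.

    Simplified illustration of the mapping; the real work is done by
    NetworkManager via the role's nm provider, not by shelling out.
    """
    parts = ["ip"]
    if rule.get("family") == "ipv6":
        parts.append("-6")
    parts += ["rule", "add", "priority", str(rule["priority"])]
    if "from" in rule:
        parts += ["from", rule["from"]]
    if "to" in rule:
        parts += ["to", rule["to"]]
    parts += ["table", str(rule["table"])]
    return " ".join(parts)

# Three of the rules from the module_args above.
rules = [
    {"priority": 30200, "from": "198.51.100.58/26", "table": 30200},
    {"priority": 30600, "family": "ipv6", "to": "2001:db8::4/32", "table": 30600},
    {"priority": 200, "from": "198.51.100.56/26", "table": "custom"},
]
for r in rules:
    print(ip_rule_cmd(r))
```

Note the last rule's `"table": "custom"`: a non-numeric table value refers to a named routing table, which must be defined on the target (conventionally in `/etc/iproute2/rt_tables`).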
44109 1727204241.57770: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Configure networking state
44109 1727204241.57870: in run() - task 028d2410-947f-ed67-a560-000000000028
44109 1727204241.57874: variable 'ansible_search_path' from source: unknown
44109 1727204241.57878: variable 'ansible_search_path' from source: unknown
44109 1727204241.57884: calling self._execute()
44109 1727204241.57993: variable 'ansible_host' from source: host vars for 'managed-node1'
44109 1727204241.58002: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1'
44109 1727204241.58017: variable 'omit' from source: magic vars
44109 1727204241.58716: variable 'ansible_distribution_major_version' from source: facts
44109 1727204241.58733: Evaluated conditional (ansible_distribution_major_version != '6'): True
44109 1727204241.58864: variable 'network_state' from source: role '' defaults
44109 1727204241.58881: Evaluated conditional (network_state != {}): False
44109 1727204241.58888: when evaluation is False, skipping this task
44109 1727204241.58895: _execute() done
44109 1727204241.58931: dumping result to json
44109 1727204241.58934: done dumping result, returning
44109 1727204241.58937: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Configure networking state [028d2410-947f-ed67-a560-000000000028]
44109 1727204241.58939: sending task result for task 028d2410-947f-ed67-a560-000000000028
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "network_state != {}",
    "skip_reason": "Conditional result was False"
}
44109 1727204241.59088: no more pending results, returning what we have
44109 1727204241.59093: results queue empty
44109 1727204241.59094: checking for any_errors_fatal
44109 1727204241.59281: done checking for any_errors_fatal
44109 1727204241.59283: checking for max_fail_percentage
44109 1727204241.59285: done checking for max_fail_percentage
44109 1727204241.59286: checking to see if all hosts have failed and the running result is not ok
44109 1727204241.59287: done checking to see if all hosts have failed
44109 1727204241.59288: getting the remaining hosts for this loop
44109 1727204241.59289: done getting the remaining hosts for this loop
44109 1727204241.59293: getting the next task for host managed-node1
44109 1727204241.59301: done getting next task for host managed-node1
44109 1727204241.59304: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections
44109 1727204241.59308: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
44109 1727204241.59327: getting variables
44109 1727204241.59329: in VariableManager get_vars()
44109 1727204241.59367: Calling all_inventory to load vars for managed-node1
44109 1727204241.59370: Calling groups_inventory to load vars for managed-node1
44109 1727204241.59372: Calling all_plugins_inventory to load vars for managed-node1
44109 1727204241.59488: Calling all_plugins_play to load vars for managed-node1
44109 1727204241.59491: Calling groups_plugins_inventory to load vars for managed-node1
44109 1727204241.59498: Calling groups_plugins_play to load vars for managed-node1
44109 1727204241.60189: done sending task result for task 028d2410-947f-ed67-a560-000000000028
44109 1727204241.60192: WORKER PROCESS EXITING
44109 1727204241.62693: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
44109 1727204241.65440: done with get_vars()
44109 1727204241.65474: done getting variables
44109 1727204241.65540: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] ***
task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177
Tuesday 24 September 2024 14:57:21 -0400 (0:00:00.087) 0:00:18.452 *****
44109 1727204241.65581: entering _queue_task() for managed-node1/debug
44109 1727204241.66411: worker is 1 (out of 1 available)
44109 1727204241.66426: exiting _queue_task() for managed-node1/debug
44109 1727204241.66460: done queuing things up, now waiting for results queue to drain
44109 1727204241.66461: waiting for pending results...
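The "Configure networking state" task above is skipped because the role default `network_state` is an empty dict and the guard `network_state != {}` therefore evaluates False. A minimal sketch (plain Python, not Ansible's real templating engine) of why that guard produces the `skipping:` result shown in the log:

```python
# Sketch only: mimics how the role's `when: network_state != {}` guard falls
# out False when the caller never overrides the network_state role default.
network_state = {}  # role default; this play only sets network_connections

conditional = network_state != {}  # what the Jinja2 expression boils down to

if not conditional:
    # Shape of the skip result Ansible reports for this task.
    result = {
        "changed": False,
        "false_condition": "network_state != {}",
        "skip_reason": "Conditional result was False",
    }
```

Setting `network_state` to any non-empty dict in the play would flip the conditional to True and run the task instead.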
44109 1727204241.66684: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections
44109 1727204241.66808: in run() - task 028d2410-947f-ed67-a560-000000000029
44109 1727204241.66827: variable 'ansible_search_path' from source: unknown
44109 1727204241.66830: variable 'ansible_search_path' from source: unknown
44109 1727204241.66865: calling self._execute()
44109 1727204241.66962: variable 'ansible_host' from source: host vars for 'managed-node1'
44109 1727204241.66966: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1'
44109 1727204241.66977: variable 'omit' from source: magic vars
44109 1727204241.67369: variable 'ansible_distribution_major_version' from source: facts
44109 1727204241.67381: Evaluated conditional (ansible_distribution_major_version != '6'): True
44109 1727204241.67387: variable 'omit' from source: magic vars
44109 1727204241.67449: variable 'omit' from source: magic vars
44109 1727204241.67537: variable 'omit' from source: magic vars
44109 1727204241.67540: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection
44109 1727204241.67581: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
44109 1727204241.67593: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell
44109 1727204241.67662: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
44109 1727204241.67665: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
44109 1727204241.67668: variable 'inventory_hostname' from source: host vars for 'managed-node1'
44109 1727204241.67670: variable 'ansible_host' from source: host vars for 'managed-node1'
44109 1727204241.67672: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1'
44109 1727204241.67771: Set connection var ansible_connection to ssh
44109 1727204241.67774: Set connection var ansible_timeout to 10
44109 1727204241.67860: Set connection var ansible_module_compression to ZIP_DEFLATED
44109 1727204241.67863: Set connection var ansible_pipelining to False
44109 1727204241.67865: Set connection var ansible_shell_executable to /bin/sh
44109 1727204241.67868: Set connection var ansible_shell_type to sh
44109 1727204241.67870: variable 'ansible_shell_executable' from source: unknown
44109 1727204241.67871: variable 'ansible_connection' from source: unknown
44109 1727204241.67874: variable 'ansible_module_compression' from source: unknown
44109 1727204241.67877: variable 'ansible_shell_type' from source: unknown
44109 1727204241.67879: variable 'ansible_shell_executable' from source: unknown
44109 1727204241.67881: variable 'ansible_host' from source: host vars for 'managed-node1'
44109 1727204241.67883: variable 'ansible_pipelining' from source: unknown
44109 1727204241.67885: variable 'ansible_timeout' from source: unknown
44109 1727204241.67887: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1'
44109 1727204241.67980: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False)
44109 1727204241.68015: variable 'omit' from source: magic vars
44109 1727204241.68019: starting attempt loop
44109 1727204241.68022: running the handler
44109 1727204241.68202: variable '__network_connections_result' from source: set_fact
44109 1727204241.68223: handler run complete
44109 1727204241.68249: attempt loop complete, returning result
44109 1727204241.68252: _execute() done
44109 1727204241.68255: dumping result to json
44109 1727204241.68258: done dumping result, returning
44109 1727204241.68265: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [028d2410-947f-ed67-a560-000000000029]
44109 1727204241.68267: sending task result for task 028d2410-947f-ed67-a560-000000000029
44109 1727204241.68612: done sending task result for task 028d2410-947f-ed67-a560-000000000029
44109 1727204241.68615: WORKER PROCESS EXITING
ok: [managed-node1] => {
    "__network_connections_result.stderr_lines": [
        "[003] #0, state:up persistent_state:present, 'ethtest0': add connection ethtest0, 51eb3d23-af2d-42f8-aa46-41043d97d664",
        "[004] #0, state:up persistent_state:present, 'ethtest0': up connection ethtest0, 51eb3d23-af2d-42f8-aa46-41043d97d664 (not-active)"
    ]
}
44109 1727204241.68702: no more pending results, returning what we have
44109 1727204241.68706: results queue empty
44109 1727204241.68707: checking for any_errors_fatal
44109 1727204241.68713: done checking for any_errors_fatal
44109 1727204241.68714: checking for max_fail_percentage
44109 1727204241.68715: done checking for max_fail_percentage
44109 1727204241.68716: checking to see if all hosts have failed and the running result is not ok
44109 1727204241.68717: done checking to see if all hosts have failed
44109 1727204241.68718: getting the remaining hosts for this loop
44109 1727204241.68719: done getting the remaining hosts for this loop
44109 1727204241.68723: getting the next task for host managed-node1
44109 1727204241.68728: done getting next task for host managed-node1
44109 1727204241.68736: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections
44109 1727204241.68739: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=23, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
44109 1727204241.68749: getting variables
44109 1727204241.68751: in VariableManager get_vars()
44109 1727204241.68792: Calling all_inventory to load vars for managed-node1
44109 1727204241.68800: Calling groups_inventory to load vars for managed-node1
44109 1727204241.68803: Calling all_plugins_inventory to load vars for managed-node1
44109 1727204241.68812: Calling all_plugins_play to load vars for managed-node1
44109 1727204241.68815: Calling groups_plugins_inventory to load vars for managed-node1
44109 1727204241.68817: Calling groups_plugins_play to load vars for managed-node1
44109 1727204241.70439: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
44109 1727204241.72146: done with get_vars()
44109 1727204241.72181: done getting variables
44109 1727204241.72252: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Show debug messages for the network_connections] ***
task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181
Tuesday 24 September 2024 14:57:21 -0400 (0:00:00.067) 0:00:18.519 *****
44109 1727204241.72307: entering _queue_task() for managed-node1/debug
44109 1727204241.72761: worker is 1 (out of 1 available)
44109 1727204241.72773: exiting _queue_task() for managed-node1/debug
44109 1727204241.72787: done queuing things up, now waiting for results queue to drain
44109 1727204241.72788: waiting for pending results...
44109 1727204241.73065: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections
44109 1727204241.73318: in run() - task 028d2410-947f-ed67-a560-00000000002a
44109 1727204241.73322: variable 'ansible_search_path' from source: unknown
44109 1727204241.73324: variable 'ansible_search_path' from source: unknown
44109 1727204241.73327: calling self._execute()
44109 1727204241.73380: variable 'ansible_host' from source: host vars for 'managed-node1'
44109 1727204241.73384: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1'
44109 1727204241.73386: variable 'omit' from source: magic vars
44109 1727204241.73884: variable 'ansible_distribution_major_version' from source: facts
44109 1727204241.73888: Evaluated conditional (ansible_distribution_major_version != '6'): True
44109 1727204241.73890: variable 'omit' from source: magic vars
44109 1727204241.73927: variable 'omit' from source: magic vars
44109 1727204241.73973: variable 'omit' from source: magic vars
44109 1727204241.74017: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection
44109 1727204241.74065: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
44109 1727204241.74087: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell
44109 1727204241.74105: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
44109 1727204241.74127: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
44109 1727204241.74157: variable 'inventory_hostname' from source: host vars for 'managed-node1'
44109 1727204241.74168: variable 'ansible_host' from source: host vars for 'managed-node1'
44109 1727204241.74171: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1'
44109 1727204241.74296: Set connection var ansible_connection to ssh
44109 1727204241.74302: Set connection var ansible_timeout to 10
44109 1727204241.74333: Set connection var ansible_module_compression to ZIP_DEFLATED
44109 1727204241.74336: Set connection var ansible_pipelining to False
44109 1727204241.74338: Set connection var ansible_shell_executable to /bin/sh
44109 1727204241.74341: Set connection var ansible_shell_type to sh
44109 1727204241.74350: variable 'ansible_shell_executable' from source: unknown
44109 1727204241.74402: variable 'ansible_connection' from source: unknown
44109 1727204241.74408: variable 'ansible_module_compression' from source: unknown
44109 1727204241.74411: variable 'ansible_shell_type' from source: unknown
44109 1727204241.74416: variable 'ansible_shell_executable' from source: unknown
44109 1727204241.74419: variable 'ansible_host' from source: host vars for 'managed-node1'
44109 1727204241.74421: variable 'ansible_pipelining' from source: unknown
44109 1727204241.74424: variable 'ansible_timeout' from source: unknown
44109 1727204241.74426: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1'
44109 1727204241.74683: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False)
44109 1727204241.74687: variable 'omit' from source: magic vars
44109 1727204241.74689: starting attempt loop
44109 1727204241.74692: running the handler
44109 1727204241.74694: variable '__network_connections_result' from source: set_fact
44109 1727204241.74701: variable '__network_connections_result' from source: set_fact
44109 1727204241.75400: handler run complete
44109 1727204241.75474: attempt loop complete, returning result
44109 1727204241.75479: _execute() done
44109 1727204241.75482: dumping result to json
44109 1727204241.75488: done dumping result, returning
44109 1727204241.75498: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [028d2410-947f-ed67-a560-00000000002a]
44109 1727204241.75500: sending task result for task 028d2410-947f-ed67-a560-00000000002a
ok: [managed-node1] => {
    "__network_connections_result": {
        "_invocation": {
            "module_args": {
                "__debug_flags": "",
                "__header": "#\n# Ansible managed\n#\n# system_role:network\n",
                "connections": [
                    {
                        "autoconnect": true,
                        "interface_name": "ethtest0",
                        "ip": {
                            "address": [
                                "198.51.100.3/26",
                                "2001:db8::2/32"
                            ],
                            "dhcp4": false,
                            "route": [
                                {
                                    "gateway": "198.51.100.6",
                                    "metric": 4,
                                    "network": "198.51.100.64",
                                    "prefix": 26,
                                    "table": 30200
                                },
                                {
                                    "gateway": "198.51.100.1",
                                    "metric": 2,
                                    "network": "198.51.100.128",
                                    "prefix": 26,
                                    "table": 30400
                                },
                                {
                                    "gateway": "2001:db8::1",
                                    "metric": 2,
                                    "network": "2001:db8::4",
                                    "prefix": 32,
                                    "table": 30600
                                }
                            ],
                            "routing_rule": [
                                {
                                    "from": "198.51.100.58/26",
                                    "priority": 30200,
                                    "table": 30200
                                },
                                {
                                    "family": "ipv4",
                                    "fwmark": 1,
                                    "fwmask": 1,
                                    "priority": 30201,
                                    "table": 30200
                                },
                                {
                                    "family": "ipv4",
                                    "ipproto": 6,
                                    "priority": 30202,
                                    "table": 30200
                                },
                                {
                                    "family": "ipv4",
                                    "priority": 30203,
                                    "sport": "128 - 256",
                                    "table": 30200
                                },
                                {
                                    "family": "ipv4",
                                    "priority": 30204,
                                    "table": 30200,
                                    "tos": 8
                                },
                                {
                                    "priority": 30400,
                                    "table": 30400,
                                    "to": "198.51.100.128/26"
                                },
                                {
                                    "family": "ipv4",
                                    "iif": "iiftest",
                                    "priority": 30401,
                                    "table": 30400
                                },
                                {
                                    "family": "ipv4",
                                    "oif": "oiftest",
                                    "priority": 30402,
                                    "table": 30400
                                },
                                {
                                    "from": "0.0.0.0/0",
                                    "priority": 30403,
                                    "table": 30400,
                                    "to": "0.0.0.0/0"
                                },
                                {
                                    "priority": 30600,
                                    "table": 30600,
                                    "to": "2001:db8::4/32"
                                },
                                {
                                    "dport": "128 - 256",
                                    "family": "ipv6",
                                    "invert": true,
                                    "priority": 30601,
                                    "table": 30600
                                },
                                {
                                    "from": "::/0",
                                    "priority": 30602,
                                    "table": 30600,
                                    "to": "::/0"
                                },
                                {
                                    "from": "198.51.100.56/26",
                                    "priority": 200,
                                    "table": "custom"
                                }
                            ]
                        },
                        "name": "ethtest0",
                        "state": "up",
                        "type": "ethernet"
                    }
                ],
                "force_state_change": false,
                "ignore_errors": false,
                "provider": "nm"
            }
        },
        "changed": true,
        "failed": false,
        "stderr": "[003] #0, state:up persistent_state:present, 'ethtest0': add connection ethtest0, 51eb3d23-af2d-42f8-aa46-41043d97d664\n[004] #0, state:up persistent_state:present, 'ethtest0': up connection ethtest0, 51eb3d23-af2d-42f8-aa46-41043d97d664 (not-active)\n",
        "stderr_lines": [
            "[003] #0, state:up persistent_state:present, 'ethtest0': add connection ethtest0, 51eb3d23-af2d-42f8-aa46-41043d97d664",
            "[004] #0, state:up persistent_state:present, 'ethtest0': up connection ethtest0, 51eb3d23-af2d-42f8-aa46-41043d97d664 (not-active)"
        ]
    }
}
44109 1727204241.76088: no more pending results, returning what we have
44109 1727204241.76091: results queue empty
44109 1727204241.76092: checking for any_errors_fatal
44109 1727204241.76098: done checking for any_errors_fatal
44109 1727204241.76099: checking for max_fail_percentage
44109 1727204241.76100: done checking for max_fail_percentage
44109 1727204241.76101: checking to see if all hosts have failed and the running result is not ok
44109 1727204241.76102: done checking to see if all hosts have failed
44109 1727204241.76103: getting the remaining hosts for this loop
44109 1727204241.76104: done getting the remaining hosts for this loop
44109 1727204241.76107: getting the next task for host managed-node1
44109 1727204241.76112: done getting next task for host managed-node1
44109 1727204241.76116: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state
44109 1727204241.76118: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
44109 1727204241.76128: getting variables
44109 1727204241.76129: in VariableManager get_vars()
44109 1727204241.76158: Calling all_inventory to load vars for managed-node1
44109 1727204241.76161: Calling groups_inventory to load vars for managed-node1
44109 1727204241.76163: Calling all_plugins_inventory to load vars for managed-node1
44109 1727204241.76170: Calling all_plugins_play to load vars for managed-node1
44109 1727204241.76172: Calling groups_plugins_inventory to load vars for managed-node1
44109 1727204241.76180: Calling groups_plugins_play to load vars for managed-node1
44109 1727204241.76190: done sending task result for task 028d2410-947f-ed67-a560-00000000002a
44109 1727204241.76192: WORKER PROCESS EXITING
44109 1727204241.77571: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
44109 1727204241.79374: done with get_vars()
44109 1727204241.79398: done getting variables
44109 1727204241.79478: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] ***
task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186
Tuesday 24 September 2024 14:57:21 -0400 (0:00:00.072) 0:00:18.591 *****
44109 1727204241.79519: entering _queue_task() for managed-node1/debug
44109 1727204241.80020: worker is 1 (out of 1 available)
44109 1727204241.80031: exiting _queue_task() for managed-node1/debug
44109 1727204241.80043: done queuing things up, now waiting for results queue to drain
44109 1727204241.80044: waiting for pending results...
44109 1727204241.80252: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state
44109 1727204241.80392: in run() - task 028d2410-947f-ed67-a560-00000000002b
44109 1727204241.80418: variable 'ansible_search_path' from source: unknown
44109 1727204241.80426: variable 'ansible_search_path' from source: unknown
44109 1727204241.80463: calling self._execute()
44109 1727204241.80563: variable 'ansible_host' from source: host vars for 'managed-node1'
44109 1727204241.80574: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1'
44109 1727204241.80595: variable 'omit' from source: magic vars
44109 1727204241.81119: variable 'ansible_distribution_major_version' from source: facts
44109 1727204241.81195: Evaluated conditional (ansible_distribution_major_version != '6'): True
44109 1727204241.81435: variable 'network_state' from source: role '' defaults
44109 1727204241.81499: Evaluated conditional (network_state != {}): False
44109 1727204241.81508: when evaluation is False, skipping this task
44109 1727204241.81515: _execute() done
44109 1727204241.81522: dumping result to json
44109 1727204241.81527: done dumping result, returning
44109 1727204241.81538: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [028d2410-947f-ed67-a560-00000000002b]
44109 1727204241.81715: sending task result for task 028d2410-947f-ed67-a560-00000000002b
44109 1727204241.81779: done sending task result for task 028d2410-947f-ed67-a560-00000000002b
44109 1727204241.81782: WORKER PROCESS EXITING
skipping: [managed-node1] => {
    "false_condition": "network_state != {}"
}
44109 1727204241.81867: no more pending results, returning what we have
44109 1727204241.81872: results queue empty
44109 1727204241.81873: checking for any_errors_fatal
44109 1727204241.81889: done checking for any_errors_fatal
44109 1727204241.81890: checking for max_fail_percentage
44109 1727204241.81891: done checking for max_fail_percentage
44109 1727204241.81893: checking to see if all hosts have failed and the running result is not ok
44109 1727204241.81894: done checking to see if all hosts have failed
44109 1727204241.81894: getting the remaining hosts for this loop
44109 1727204241.81896: done getting the remaining hosts for this loop
44109 1727204241.81899: getting the next task for host managed-node1
44109 1727204241.81907: done getting next task for host managed-node1
44109 1727204241.81911: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity
44109 1727204241.81915: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
44109 1727204241.82085: getting variables
44109 1727204241.82088: in VariableManager get_vars()
44109 1727204241.82126: Calling all_inventory to load vars for managed-node1
44109 1727204241.82129: Calling groups_inventory to load vars for managed-node1
44109 1727204241.82132: Calling all_plugins_inventory to load vars for managed-node1
44109 1727204241.82143: Calling all_plugins_play to load vars for managed-node1
44109 1727204241.82146: Calling groups_plugins_inventory to load vars for managed-node1
44109 1727204241.82149: Calling groups_plugins_play to load vars for managed-node1
44109 1727204241.83777: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
44109 1727204241.85525: done with get_vars()
44109 1727204241.85546: done getting variables

TASK [fedora.linux_system_roles.network : Re-test connectivity] ****************
task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192
Tuesday 24 September 2024 14:57:21 -0400 (0:00:00.061) 0:00:18.652 *****
44109 1727204241.85647: entering _queue_task() for managed-node1/ping
44109 1727204241.85649: Creating lock for ping
44109 1727204241.85991: worker is 1 (out of 1 available)
44109 1727204241.86004: exiting _queue_task() for managed-node1/ping
44109 1727204241.86014: done queuing things up, now waiting for results queue to drain
44109 1727204241.86015: waiting for pending results...
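The `stderr_lines` entries in the result above follow a regular pattern: sequence number, connection index, desired state, action, profile name, and NetworkManager connection UUID. A small hypothetical helper (not part of the role or of Ansible) that pulls those fields out for post-processing a run like this one:

```python
import re

# Hypothetical parser for the network role's stderr lines, e.g.
# "[003] #0, state:up persistent_state:present, 'ethtest0': add connection ethtest0, <uuid>"
LINE_RE = re.compile(
    r"\[(?P<seq>\d+)\] #(?P<idx>\d+), "
    r"state:(?P<state>\S+) persistent_state:(?P<persistent>\S+), "
    r"'(?P<profile>[^']+)': (?P<action>\w+) connection (?P<name>\S+), "
    r"(?P<uuid>[0-9a-f-]{36})"
)

def parse_stderr_lines(lines):
    """Return one dict of named fields per line that matches the pattern."""
    return [m.groupdict() for line in lines if (m := LINE_RE.match(line))]

# The two lines reported by the task result in the log above.
stderr_lines = [
    "[003] #0, state:up persistent_state:present, 'ethtest0': add connection ethtest0, 51eb3d23-af2d-42f8-aa46-41043d97d664",
    "[004] #0, state:up persistent_state:present, 'ethtest0': up connection ethtest0, 51eb3d23-af2d-42f8-aa46-41043d97d664 (not-active)",
]
parsed = parse_stderr_lines(stderr_lines)
```

Trailing annotations such as `(not-active)` are deliberately ignored by the pattern, so both the `add` and the `up` line parse to the same profile and UUID.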
44109 1727204241.86307: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Re-test connectivity
44109 1727204241.86441: in run() - task 028d2410-947f-ed67-a560-00000000002c
44109 1727204241.86463: variable 'ansible_search_path' from source: unknown
44109 1727204241.86471: variable 'ansible_search_path' from source: unknown
44109 1727204241.86518: calling self._execute()
44109 1727204241.86616: variable 'ansible_host' from source: host vars for 'managed-node1'
44109 1727204241.86728: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1'
44109 1727204241.86731: variable 'omit' from source: magic vars
44109 1727204241.87015: variable 'ansible_distribution_major_version' from source: facts
44109 1727204241.87031: Evaluated conditional (ansible_distribution_major_version != '6'): True
44109 1727204241.87042: variable 'omit' from source: magic vars
44109 1727204241.87116: variable 'omit' from source: magic vars
44109 1727204241.87161: variable 'omit' from source: magic vars
44109 1727204241.87207: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection
44109 1727204241.87247: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
44109 1727204241.87282: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell
44109 1727204241.87305: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
44109 1727204241.87323: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
44109 1727204241.87383: variable 'inventory_hostname' from source: host vars for 'managed-node1'
44109 1727204241.87386: variable 'ansible_host' from source: host vars for 'managed-node1'
44109 1727204241.87389: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1'
44109 1727204241.87478: Set connection var ansible_connection to ssh
44109 1727204241.87580: Set connection var ansible_timeout to 10
44109 1727204241.87583: Set connection var ansible_module_compression to ZIP_DEFLATED
44109 1727204241.87584: Set connection var ansible_pipelining to False
44109 1727204241.87586: Set connection var ansible_shell_executable to /bin/sh
44109 1727204241.87588: Set connection var ansible_shell_type to sh
44109 1727204241.87589: variable 'ansible_shell_executable' from source: unknown
44109 1727204241.87592: variable 'ansible_connection' from source: unknown
44109 1727204241.87594: variable 'ansible_module_compression' from source: unknown
44109 1727204241.87596: variable 'ansible_shell_type' from source: unknown
44109 1727204241.87598: variable 'ansible_shell_executable' from source: unknown
44109 1727204241.87599: variable 'ansible_host' from source: host vars for 'managed-node1'
44109 1727204241.87601: variable 'ansible_pipelining' from source: unknown
44109 1727204241.87602: variable 'ansible_timeout' from source: unknown
44109 1727204241.87604: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1'
44109 1727204241.87946: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__)
44109 1727204241.88055: variable 'omit' from source: magic vars
44109 1727204241.88058: starting attempt loop
44109 1727204241.88061: running the handler
44109 1727204241.88063: _low_level_execute_command(): starting
44109 1727204241.88065: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0'
44109 1727204241.89441: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<<
44109 1727204241.89444: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
44109 1727204241.89447: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration <<<
44109 1727204241.89449: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
44109 1727204241.89451: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
44109 1727204241.89571: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK <<<
44109 1727204241.89678: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
44109 1727204241.89784: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
44109 1727204241.91740: stdout chunk (state=3): >>>/root <<<
44109 1727204241.91769: stdout chunk (state=3): >>><<<
44109 1727204241.91772: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
44109 1727204241.91778: stderr chunk (state=3): >>><<<
44109 1727204241.91803: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0
44109 1727204241.91821: _low_level_execute_command(): starting
44109 1727204241.91827: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204241.918028-45981-60956695391256 `" && echo ansible-tmp-1727204241.918028-45981-60956695391256="` echo /root/.ansible/tmp/ansible-tmp-1727204241.918028-45981-60956695391256 `" ) && sleep 0'
44109 1727204241.92422: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<<
44109 1727204241.92515: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
44109 1727204241.92535: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<<
44109 1727204241.92555: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<<
44109 1727204241.92565: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
44109 1727204241.92704: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
44109 1727204241.94986: stdout chunk (state=3): >>>ansible-tmp-1727204241.918028-45981-60956695391256=/root/.ansible/tmp/ansible-tmp-1727204241.918028-45981-60956695391256 <<<
44109 1727204241.94990: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
44109 1727204241.94992: stdout chunk (state=3): >>><<<
44109 1727204241.94995: stderr chunk (state=3): >>><<<
44109 1727204241.95187: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204241.918028-45981-60956695391256=/root/.ansible/tmp/ansible-tmp-1727204241.918028-45981-60956695391256 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0
44109 1727204241.95191: variable 'ansible_module_compression' from source: unknown
44109 1727204241.95228: ANSIBALLZ: Using lock for ping
44109 1727204241.95288: ANSIBALLZ: Acquiring lock
44109 1727204241.95297: ANSIBALLZ: Lock acquired: 139907468664784
44109 1727204241.95306: ANSIBALLZ: Creating module
44109 1727204242.12733: ANSIBALLZ: Writing module into payload
44109 1727204242.12774: ANSIBALLZ: Writing module
44109 1727204242.12793: ANSIBALLZ: Renaming module
44109 1727204242.12799: ANSIBALLZ: Done creating module
44109 1727204242.12814: variable 'ansible_facts' from source: unknown
44109 1727204242.12858: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204241.918028-45981-60956695391256/AnsiballZ_ping.py
44109 1727204242.12962: Sending initial data
44109 1727204242.12966: Sent initial data (151 bytes)
44109 1727204242.13420: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<<
44109 1727204242.13423: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
44109 1727204242.13426: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2:
resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 44109 1727204242.13428: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44109 1727204242.13481: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 44109 1727204242.13485: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44109 1727204242.13496: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44109 1727204242.13582: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44109 1727204242.15382: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 44109 1727204242.15439: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 44109 1727204242.15534: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-44109pzfqangk/tmp_dvoz7mt /root/.ansible/tmp/ansible-tmp-1727204241.918028-45981-60956695391256/AnsiballZ_ping.py <<< 44109 1727204242.15542: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204241.918028-45981-60956695391256/AnsiballZ_ping.py" <<< 44109 1727204242.15607: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-44109pzfqangk/tmp_dvoz7mt" to remote "/root/.ansible/tmp/ansible-tmp-1727204241.918028-45981-60956695391256/AnsiballZ_ping.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204241.918028-45981-60956695391256/AnsiballZ_ping.py" <<< 44109 1727204242.16283: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44109 1727204242.16315: stderr chunk (state=3): >>><<< 44109 1727204242.16318: stdout chunk (state=3): >>><<< 44109 1727204242.16323: done transferring module to remote 44109 1727204242.16332: _low_level_execute_command(): starting 44109 1727204242.16337: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204241.918028-45981-60956695391256/ /root/.ansible/tmp/ansible-tmp-1727204241.918028-45981-60956695391256/AnsiballZ_ping.py && sleep 0' 44109 1727204242.16780: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 44109 1727204242.16783: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found <<< 44109 1727204242.16785: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44109 1727204242.16788: stderr 
chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44109 1727204242.16790: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44109 1727204242.16841: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 44109 1727204242.16844: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 44109 1727204242.16933: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44109 1727204242.18904: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44109 1727204242.18926: stderr chunk (state=3): >>><<< 44109 1727204242.18929: stdout chunk (state=3): >>><<< 44109 1727204242.18943: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking 
match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44109 1727204242.18946: _low_level_execute_command(): starting 44109 1727204242.18951: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204241.918028-45981-60956695391256/AnsiballZ_ping.py && sleep 0' 44109 1727204242.19380: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44109 1727204242.19414: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found <<< 44109 1727204242.19418: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44109 1727204242.19420: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address <<< 44109 1727204242.19424: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 <<< 44109 1727204242.19426: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44109 1727204242.19472: stderr chunk (state=3): >>>debug1: auto-mux: Trying 
existing master at '/root/.ansible/cp/a0f5415566' <<< 44109 1727204242.19478: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44109 1727204242.19482: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44109 1727204242.19570: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44109 1727204242.35858: stdout chunk (state=3): >>> {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} <<< 44109 1727204242.37447: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.47 closed. <<< 44109 1727204242.37488: stderr chunk (state=3): >>><<< 44109 1727204242.37511: stdout chunk (state=3): >>><<< 44109 1727204242.37561: _low_level_execute_command() done: rc=0, stdout= {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 
2 debug2: Received exit status from master 0 Shared connection to 10.31.14.47 closed. 44109 1727204242.37594: done with _execute_module (ping, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ping', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204241.918028-45981-60956695391256/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 44109 1727204242.37598: _low_level_execute_command(): starting 44109 1727204242.37600: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204241.918028-45981-60956695391256/ > /dev/null 2>&1 && sleep 0' 44109 1727204242.38483: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK <<< 44109 1727204242.38700: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44109 1727204242.38789: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44109 1727204242.40821: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44109 1727204242.41023: stdout chunk (state=3): >>><<< 44109 1727204242.41026: stderr chunk (state=3): >>><<< 44109 1727204242.41028: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44109 1727204242.41033: handler run complete 44109 1727204242.41035: attempt loop complete, returning result 44109 1727204242.41037: _execute() done 44109 
1727204242.41038: dumping result to json 44109 1727204242.41040: done dumping result, returning 44109 1727204242.41041: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Re-test connectivity [028d2410-947f-ed67-a560-00000000002c] 44109 1727204242.41043: sending task result for task 028d2410-947f-ed67-a560-00000000002c 44109 1727204242.41114: done sending task result for task 028d2410-947f-ed67-a560-00000000002c 44109 1727204242.41117: WORKER PROCESS EXITING ok: [managed-node1] => { "changed": false, "ping": "pong" } 44109 1727204242.41187: no more pending results, returning what we have 44109 1727204242.41192: results queue empty 44109 1727204242.41193: checking for any_errors_fatal 44109 1727204242.41202: done checking for any_errors_fatal 44109 1727204242.41203: checking for max_fail_percentage 44109 1727204242.41205: done checking for max_fail_percentage 44109 1727204242.41206: checking to see if all hosts have failed and the running result is not ok 44109 1727204242.41207: done checking to see if all hosts have failed 44109 1727204242.41208: getting the remaining hosts for this loop 44109 1727204242.41209: done getting the remaining hosts for this loop 44109 1727204242.41214: getting the next task for host managed-node1 44109 1727204242.41224: done getting next task for host managed-node1 44109 1727204242.41227: ^ task is: TASK: meta (role_complete) 44109 1727204242.41230: ^ state is: HOST STATE: block=2, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False 44109 1727204242.41242: getting variables 44109 1727204242.41244: in VariableManager get_vars() 44109 1727204242.41383: Calling all_inventory to load vars for managed-node1 44109 1727204242.41386: Calling groups_inventory to load vars for managed-node1 44109 1727204242.41389: Calling all_plugins_inventory to load vars for managed-node1 44109 1727204242.41398: Calling all_plugins_play to load vars for managed-node1 44109 1727204242.41400: Calling groups_plugins_inventory to load vars for managed-node1 44109 1727204242.41402: Calling groups_plugins_play to load vars for managed-node1 44109 1727204242.43705: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44109 1727204242.45287: done with get_vars() 44109 1727204242.45318: done getting variables 44109 1727204242.45404: done queuing things up, now waiting for results queue to drain 44109 1727204242.45406: results queue empty 44109 1727204242.45407: checking for any_errors_fatal 44109 1727204242.45410: done checking for any_errors_fatal 44109 1727204242.45411: checking for max_fail_percentage 44109 1727204242.45414: done checking for max_fail_percentage 44109 1727204242.45415: checking to see if all hosts have failed and the running result is not ok 44109 1727204242.45416: done checking to see if all hosts have failed 44109 1727204242.45416: getting the remaining hosts for this loop 44109 1727204242.45417: done getting the remaining hosts for this loop 44109 1727204242.45421: getting the next task for host managed-node1 44109 1727204242.45425: done getting next task for host managed-node1 44109 1727204242.45428: ^ task is: TASK: Get the routing rule for looking up the table 30200 44109 1727204242.45429: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False 44109 1727204242.45432: getting variables 44109 1727204242.45433: in VariableManager get_vars() 44109 1727204242.45447: Calling all_inventory to load vars for managed-node1 44109 1727204242.45449: Calling groups_inventory to load vars for managed-node1 44109 1727204242.45451: Calling all_plugins_inventory to load vars for managed-node1 44109 1727204242.45456: Calling all_plugins_play to load vars for managed-node1 44109 1727204242.45458: Calling groups_plugins_inventory to load vars for managed-node1 44109 1727204242.45461: Calling groups_plugins_play to load vars for managed-node1 44109 1727204242.47334: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44109 1727204242.50115: done with get_vars() 44109 1727204242.50145: done getting variables 44109 1727204242.50406: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Get the routing rule for looking up the table 30200] ********************* task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_routing_rules.yml:115 Tuesday 24 September 2024 14:57:22 -0400 (0:00:00.647) 0:00:19.300 ***** 44109 1727204242.50439: entering _queue_task() for managed-node1/command 44109 1727204242.50892: worker is 1 (out of 1 available) 44109 1727204242.50905: exiting _queue_task() for managed-node1/command 44109 1727204242.50919: done queuing things up, now waiting for results queue to drain 44109 1727204242.50920: waiting for pending results... 
44109 1727204242.51305: running TaskExecutor() for managed-node1/TASK: Get the routing rule for looking up the table 30200 44109 1727204242.51383: in run() - task 028d2410-947f-ed67-a560-00000000005c 44109 1727204242.51388: variable 'ansible_search_path' from source: unknown 44109 1727204242.51390: calling self._execute() 44109 1727204242.51484: variable 'ansible_host' from source: host vars for 'managed-node1' 44109 1727204242.51496: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44109 1727204242.51522: variable 'omit' from source: magic vars 44109 1727204242.52141: variable 'ansible_distribution_major_version' from source: facts 44109 1727204242.52183: Evaluated conditional (ansible_distribution_major_version != '6'): True 44109 1727204242.52404: variable 'ansible_distribution_major_version' from source: facts 44109 1727204242.52419: Evaluated conditional (ansible_distribution_major_version != "7"): True 44109 1727204242.52432: variable 'omit' from source: magic vars 44109 1727204242.52458: variable 'omit' from source: magic vars 44109 1727204242.52507: variable 'omit' from source: magic vars 44109 1727204242.52630: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 44109 1727204242.52723: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 44109 1727204242.52817: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 44109 1727204242.52842: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44109 1727204242.53007: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44109 1727204242.53013: variable 'inventory_hostname' from source: host vars for 'managed-node1' 44109 1727204242.53016: variable 'ansible_host' from source: host vars for 
'managed-node1' 44109 1727204242.53019: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44109 1727204242.53227: Set connection var ansible_connection to ssh 44109 1727204242.53241: Set connection var ansible_timeout to 10 44109 1727204242.53289: Set connection var ansible_module_compression to ZIP_DEFLATED 44109 1727204242.53303: Set connection var ansible_pipelining to False 44109 1727204242.53316: Set connection var ansible_shell_executable to /bin/sh 44109 1727204242.53329: Set connection var ansible_shell_type to sh 44109 1727204242.53360: variable 'ansible_shell_executable' from source: unknown 44109 1727204242.53662: variable 'ansible_connection' from source: unknown 44109 1727204242.53666: variable 'ansible_module_compression' from source: unknown 44109 1727204242.53668: variable 'ansible_shell_type' from source: unknown 44109 1727204242.53670: variable 'ansible_shell_executable' from source: unknown 44109 1727204242.53672: variable 'ansible_host' from source: host vars for 'managed-node1' 44109 1727204242.53674: variable 'ansible_pipelining' from source: unknown 44109 1727204242.53677: variable 'ansible_timeout' from source: unknown 44109 1727204242.53679: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44109 1727204242.53754: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 44109 1727204242.53980: variable 'omit' from source: magic vars 44109 1727204242.53984: starting attempt loop 44109 1727204242.53987: running the handler 44109 1727204242.53989: _low_level_execute_command(): starting 44109 1727204242.53992: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 44109 1727204242.55396: stderr chunk (state=2): >>>OpenSSH_9.8p1, 
OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44109 1727204242.55517: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 44109 1727204242.55534: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44109 1727204242.55993: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44109 1727204242.56097: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44109 1727204242.57926: stdout chunk (state=3): >>>/root <<< 44109 1727204242.58053: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44109 1727204242.58056: stdout chunk (state=3): >>><<< 44109 1727204242.58066: stderr chunk (state=3): >>><<< 44109 1727204242.58091: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: 
match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44109 1727204242.58107: _low_level_execute_command(): starting 44109 1727204242.58115: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204242.5809286-46013-12531845135254 `" && echo ansible-tmp-1727204242.5809286-46013-12531845135254="` echo /root/.ansible/tmp/ansible-tmp-1727204242.5809286-46013-12531845135254 `" ) && sleep 0' 44109 1727204242.59310: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44109 1727204242.59337: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44109 1727204242.59341: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44109 1727204242.59343: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44109 1727204242.59354: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 <<< 44109 1727204242.59366: stderr chunk (state=3): >>>debug2: match not found <<< 44109 1727204242.59368: stderr chunk (state=3): 
>>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44109 1727204242.59447: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 44109 1727204242.59450: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.14.47 is address <<< 44109 1727204242.59492: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK <<< 44109 1727204242.59694: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44109 1727204242.59908: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44109 1727204242.62034: stdout chunk (state=3): >>>ansible-tmp-1727204242.5809286-46013-12531845135254=/root/.ansible/tmp/ansible-tmp-1727204242.5809286-46013-12531845135254 <<< 44109 1727204242.62189: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44109 1727204242.62193: stdout chunk (state=3): >>><<< 44109 1727204242.62236: stderr chunk (state=3): >>><<< 44109 1727204242.62271: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204242.5809286-46013-12531845135254=/root/.ansible/tmp/ansible-tmp-1727204242.5809286-46013-12531845135254 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found 
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44109 1727204242.62306: variable 'ansible_module_compression' from source: unknown 44109 1727204242.62418: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-44109pzfqangk/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 44109 1727204242.62662: variable 'ansible_facts' from source: unknown 44109 1727204242.62941: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204242.5809286-46013-12531845135254/AnsiballZ_command.py 44109 1727204242.63394: Sending initial data 44109 1727204242.63398: Sent initial data (155 bytes) 44109 1727204242.64796: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration 
debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44109 1727204242.64824: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK <<< 44109 1727204242.65044: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44109 1727204242.65196: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44109 1727204242.66884: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 44109 1727204242.66957: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 44109 1727204242.67094: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-44109pzfqangk/tmptb69eras /root/.ansible/tmp/ansible-tmp-1727204242.5809286-46013-12531845135254/AnsiballZ_command.py <<< 44109 1727204242.67097: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204242.5809286-46013-12531845135254/AnsiballZ_command.py" <<< 44109 1727204242.67123: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-44109pzfqangk/tmptb69eras" to remote "/root/.ansible/tmp/ansible-tmp-1727204242.5809286-46013-12531845135254/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204242.5809286-46013-12531845135254/AnsiballZ_command.py" <<< 44109 1727204242.68850: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44109 1727204242.68867: stderr chunk (state=3): >>><<< 44109 1727204242.68870: stdout chunk (state=3): >>><<< 44109 1727204242.68981: done transferring module to remote 44109 1727204242.68986: _low_level_execute_command(): starting 44109 1727204242.68991: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204242.5809286-46013-12531845135254/ /root/.ansible/tmp/ansible-tmp-1727204242.5809286-46013-12531845135254/AnsiballZ_command.py && sleep 0' 44109 1727204242.70306: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44109 1727204242.70334: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44109 1727204242.70393: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44109 1727204242.70443: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 44109 1727204242.70456: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44109 1727204242.70517: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44109 1727204242.70793: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44109 1727204242.72652: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44109 1727204242.72905: stderr chunk (state=3): >>><<< 44109 1727204242.72909: stdout chunk (state=3): >>><<< 44109 1727204242.72914: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44109 1727204242.72917: _low_level_execute_command(): starting 44109 1727204242.72919: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204242.5809286-46013-12531845135254/AnsiballZ_command.py && sleep 0' 44109 1727204242.74280: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44109 1727204242.74284: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44109 1727204242.74286: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44109 1727204242.74288: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44109 1727204242.74290: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 <<< 44109 1727204242.74292: stderr chunk (state=3): >>>debug2: match not found <<< 44109 1727204242.74294: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44109 1727204242.74295: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 44109 1727204242.74297: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.14.47 is address <<< 44109 1727204242.74299: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 44109 1727204242.74301: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44109 1727204242.74303: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44109 1727204242.74305: 
stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44109 1727204242.74307: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 <<< 44109 1727204242.74309: stderr chunk (state=3): >>>debug2: match found <<< 44109 1727204242.74314: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44109 1727204242.74316: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 44109 1727204242.74383: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44109 1727204242.74493: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44109 1727204242.74750: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44109 1727204242.91913: stdout chunk (state=3): >>> {"changed": true, "stdout": "30200:\tfrom 198.51.100.58/26 lookup 30200 proto static\n30201:\tfrom all fwmark 0x1/0x1 lookup 30200 proto static\n30202:\tfrom all ipproto tcp lookup 30200 proto static\n30203:\tfrom all sport 128-256 lookup 30200 proto static\n30204:\tfrom all tos throughput lookup 30200 proto static", "stderr": "", "rc": 0, "cmd": ["ip", "rule", "list", "table", "30200"], "start": "2024-09-24 14:57:22.911681", "end": "2024-09-24 14:57:22.917266", "delta": "0:00:00.005585", "msg": "", "invocation": {"module_args": {"_raw_params": "ip rule list table 30200", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 44109 1727204242.94024: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.47 closed. 
<<< 44109 1727204242.94028: stdout chunk (state=3): >>><<< 44109 1727204242.94033: stderr chunk (state=3): >>><<< 44109 1727204242.94099: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "30200:\tfrom 198.51.100.58/26 lookup 30200 proto static\n30201:\tfrom all fwmark 0x1/0x1 lookup 30200 proto static\n30202:\tfrom all ipproto tcp lookup 30200 proto static\n30203:\tfrom all sport 128-256 lookup 30200 proto static\n30204:\tfrom all tos throughput lookup 30200 proto static", "stderr": "", "rc": 0, "cmd": ["ip", "rule", "list", "table", "30200"], "start": "2024-09-24 14:57:22.911681", "end": "2024-09-24 14:57:22.917266", "delta": "0:00:00.005585", "msg": "", "invocation": {"module_args": {"_raw_params": "ip rule list table 30200", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK 
debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.47 closed. 44109 1727204242.94159: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip rule list table 30200', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204242.5809286-46013-12531845135254/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 44109 1727204242.94167: _low_level_execute_command(): starting 44109 1727204242.94169: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204242.5809286-46013-12531845135254/ > /dev/null 2>&1 && sleep 0' 44109 1727204242.94866: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44109 1727204242.94870: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44109 1727204242.94873: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44109 1727204242.94875: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44109 1727204242.94924: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 44109 1727204242.94935: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 44109 1727204242.95019: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44109 1727204242.97083: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44109 1727204242.97110: stderr chunk (state=3): >>><<< 44109 1727204242.97113: stdout chunk (state=3): >>><<< 44109 1727204242.97130: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing 
master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44109 1727204242.97141: handler run complete 44109 1727204242.97479: Evaluated conditional (False): False 44109 1727204242.97482: attempt loop complete, returning result 44109 1727204242.97484: _execute() done 44109 1727204242.97486: dumping result to json 44109 1727204242.97488: done dumping result, returning 44109 1727204242.97490: done running TaskExecutor() for managed-node1/TASK: Get the routing rule for looking up the table 30200 [028d2410-947f-ed67-a560-00000000005c] 44109 1727204242.97492: sending task result for task 028d2410-947f-ed67-a560-00000000005c ok: [managed-node1] => { "changed": false, "cmd": [ "ip", "rule", "list", "table", "30200" ], "delta": "0:00:00.005585", "end": "2024-09-24 14:57:22.917266", "rc": 0, "start": "2024-09-24 14:57:22.911681" } STDOUT: 30200: from 198.51.100.58/26 lookup 30200 proto static 30201: from all fwmark 0x1/0x1 lookup 30200 proto static 30202: from all ipproto tcp lookup 30200 proto static 30203: from all sport 128-256 lookup 30200 proto static 30204: from all tos throughput lookup 30200 proto static 44109 1727204242.97663: no more pending results, returning what we have 44109 1727204242.97667: results queue empty 44109 1727204242.97668: checking for any_errors_fatal 44109 1727204242.97670: done checking for any_errors_fatal 44109 1727204242.97671: checking for max_fail_percentage 44109 1727204242.97673: done checking for max_fail_percentage 44109 1727204242.97674: checking to see if all hosts have failed and the running result is not ok 44109 1727204242.97720: done checking to see if all hosts have failed 44109 1727204242.97722: getting the remaining hosts for this loop 44109 1727204242.97724: done getting the remaining hosts for this loop 44109 1727204242.97728: getting the next task for host 
managed-node1 44109 1727204242.97735: done getting next task for host managed-node1 44109 1727204242.97738: ^ task is: TASK: Get the routing rule for looking up the table 30400 44109 1727204242.97741: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44109 1727204242.97745: getting variables 44109 1727204242.97747: in VariableManager get_vars() 44109 1727204242.97791: Calling all_inventory to load vars for managed-node1 44109 1727204242.97795: Calling groups_inventory to load vars for managed-node1 44109 1727204242.97797: Calling all_plugins_inventory to load vars for managed-node1 44109 1727204242.97925: Calling all_plugins_play to load vars for managed-node1 44109 1727204242.97930: Calling groups_plugins_inventory to load vars for managed-node1 44109 1727204242.97934: Calling groups_plugins_play to load vars for managed-node1 44109 1727204242.98693: done sending task result for task 028d2410-947f-ed67-a560-00000000005c 44109 1727204242.98700: WORKER PROCESS EXITING 44109 1727204242.99692: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44109 1727204243.02189: done with get_vars() 44109 1727204243.02224: done getting variables 44109 1727204243.02314: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Get the routing rule for looking up the table 30400] ********************* task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_routing_rules.yml:122 Tuesday 
24 September 2024 14:57:23 -0400 (0:00:00.519) 0:00:19.819 ***** 44109 1727204243.02346: entering _queue_task() for managed-node1/command 44109 1727204243.02819: worker is 1 (out of 1 available) 44109 1727204243.02946: exiting _queue_task() for managed-node1/command 44109 1727204243.02957: done queuing things up, now waiting for results queue to drain 44109 1727204243.02958: waiting for pending results... 44109 1727204243.03155: running TaskExecutor() for managed-node1/TASK: Get the routing rule for looking up the table 30400 44109 1727204243.03231: in run() - task 028d2410-947f-ed67-a560-00000000005d 44109 1727204243.03243: variable 'ansible_search_path' from source: unknown 44109 1727204243.03283: calling self._execute() 44109 1727204243.03355: variable 'ansible_host' from source: host vars for 'managed-node1' 44109 1727204243.03359: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44109 1727204243.03369: variable 'omit' from source: magic vars 44109 1727204243.03662: variable 'ansible_distribution_major_version' from source: facts 44109 1727204243.03671: Evaluated conditional (ansible_distribution_major_version != '6'): True 44109 1727204243.03755: variable 'ansible_distribution_major_version' from source: facts 44109 1727204243.03758: Evaluated conditional (ansible_distribution_major_version != "7"): True 44109 1727204243.03766: variable 'omit' from source: magic vars 44109 1727204243.03784: variable 'omit' from source: magic vars 44109 1727204243.03811: variable 'omit' from source: magic vars 44109 1727204243.03848: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 44109 1727204243.03874: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 44109 1727204243.03891: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 44109 1727204243.03906: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44109 1727204243.03919: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44109 1727204243.03944: variable 'inventory_hostname' from source: host vars for 'managed-node1' 44109 1727204243.03947: variable 'ansible_host' from source: host vars for 'managed-node1' 44109 1727204243.03949: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44109 1727204243.04024: Set connection var ansible_connection to ssh 44109 1727204243.04028: Set connection var ansible_timeout to 10 44109 1727204243.04036: Set connection var ansible_module_compression to ZIP_DEFLATED 44109 1727204243.04038: Set connection var ansible_pipelining to False 44109 1727204243.04046: Set connection var ansible_shell_executable to /bin/sh 44109 1727204243.04048: Set connection var ansible_shell_type to sh 44109 1727204243.04067: variable 'ansible_shell_executable' from source: unknown 44109 1727204243.04070: variable 'ansible_connection' from source: unknown 44109 1727204243.04073: variable 'ansible_module_compression' from source: unknown 44109 1727204243.04076: variable 'ansible_shell_type' from source: unknown 44109 1727204243.04079: variable 'ansible_shell_executable' from source: unknown 44109 1727204243.04081: variable 'ansible_host' from source: host vars for 'managed-node1' 44109 1727204243.04085: variable 'ansible_pipelining' from source: unknown 44109 1727204243.04087: variable 'ansible_timeout' from source: unknown 44109 1727204243.04091: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44109 1727204243.04198: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 44109 1727204243.04206: variable 'omit' from source: magic vars 44109 1727204243.04211: starting attempt loop 44109 1727204243.04216: running the handler 44109 1727204243.04230: _low_level_execute_command(): starting 44109 1727204243.04238: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 44109 1727204243.04760: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44109 1727204243.04764: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44109 1727204243.04766: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44109 1727204243.04768: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44109 1727204243.04825: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 44109 1727204243.04828: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44109 1727204243.04830: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44109 1727204243.04910: stderr chunk (state=3): 
>>>debug1: mux_client_request_session: master session id: 2 <<< 44109 1727204243.06683: stdout chunk (state=3): >>>/root <<< 44109 1727204243.06788: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44109 1727204243.06815: stderr chunk (state=3): >>><<< 44109 1727204243.06821: stdout chunk (state=3): >>><<< 44109 1727204243.06843: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44109 1727204243.06854: _low_level_execute_command(): starting 44109 1727204243.06861: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204243.0684242-46053-165197739467627 `" && echo ansible-tmp-1727204243.0684242-46053-165197739467627="` echo 
/root/.ansible/tmp/ansible-tmp-1727204243.0684242-46053-165197739467627 `" ) && sleep 0' 44109 1727204243.07320: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 44109 1727204243.07324: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found <<< 44109 1727204243.07334: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44109 1727204243.07336: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44109 1727204243.07339: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44109 1727204243.07377: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 44109 1727204243.07382: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44109 1727204243.07384: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44109 1727204243.07472: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44109 1727204243.09568: stdout chunk (state=3): >>>ansible-tmp-1727204243.0684242-46053-165197739467627=/root/.ansible/tmp/ansible-tmp-1727204243.0684242-46053-165197739467627 <<< 44109 1727204243.09682: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44109 
1727204243.09707: stderr chunk (state=3): >>><<< 44109 1727204243.09710: stdout chunk (state=3): >>><<< 44109 1727204243.09728: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204243.0684242-46053-165197739467627=/root/.ansible/tmp/ansible-tmp-1727204243.0684242-46053-165197739467627 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44109 1727204243.09754: variable 'ansible_module_compression' from source: unknown 44109 1727204243.09804: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-44109pzfqangk/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 44109 1727204243.09835: variable 'ansible_facts' from source: unknown 44109 1727204243.09890: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204243.0684242-46053-165197739467627/AnsiballZ_command.py 44109 1727204243.09993: Sending initial data 44109 
1727204243.09996: Sent initial data (156 bytes) 44109 1727204243.10458: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44109 1727204243.10461: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44109 1727204243.10463: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44109 1727204243.10466: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration <<< 44109 1727204243.10468: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44109 1727204243.10516: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 44109 1727204243.10519: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44109 1727204243.10521: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44109 1727204243.10603: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44109 1727204243.12333: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension 
"hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" <<< 44109 1727204243.12338: stderr chunk (state=3): >>>debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 44109 1727204243.12411: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 44109 1727204243.12493: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-44109pzfqangk/tmp44tv68bv /root/.ansible/tmp/ansible-tmp-1727204243.0684242-46053-165197739467627/AnsiballZ_command.py <<< 44109 1727204243.12495: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204243.0684242-46053-165197739467627/AnsiballZ_command.py" <<< 44109 1727204243.12560: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-44109pzfqangk/tmp44tv68bv" to remote "/root/.ansible/tmp/ansible-tmp-1727204243.0684242-46053-165197739467627/AnsiballZ_command.py" <<< 44109 1727204243.12563: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204243.0684242-46053-165197739467627/AnsiballZ_command.py" <<< 44109 1727204243.13222: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44109 1727204243.13267: stderr chunk (state=3): >>><<< 44109 1727204243.13270: stdout chunk (state=3): >>><<< 44109 1727204243.13312: done transferring module to remote 44109 1727204243.13323: _low_level_execute_command(): starting 44109 1727204243.13327: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x 
/root/.ansible/tmp/ansible-tmp-1727204243.0684242-46053-165197739467627/ /root/.ansible/tmp/ansible-tmp-1727204243.0684242-46053-165197739467627/AnsiballZ_command.py && sleep 0' 44109 1727204243.13748: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 44109 1727204243.13755: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44109 1727204243.13779: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44109 1727204243.13786: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44109 1727204243.13788: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44109 1727204243.13843: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 44109 1727204243.13850: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44109 1727204243.13852: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44109 1727204243.13930: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44109 1727204243.15879: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44109 1727204243.15906: stderr chunk (state=3): >>><<< 44109 1727204243.15909: stdout chunk (state=3): 
>>><<< 44109 1727204243.15926: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44109 1727204243.15929: _low_level_execute_command(): starting 44109 1727204243.15933: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204243.0684242-46053-165197739467627/AnsiballZ_command.py && sleep 0' 44109 1727204243.16366: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 44109 1727204243.16372: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 44109 1727204243.16393: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44109 1727204243.16443: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 44109 1727204243.16449: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44109 1727204243.16452: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44109 1727204243.16534: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44109 1727204243.33602: stdout chunk (state=3): >>> {"changed": true, "stdout": "30400:\tfrom all to 198.51.100.128/26 lookup 30400 proto static\n30401:\tfrom all iif iiftest [detached] lookup 30400 proto static\n30402:\tfrom all oif oiftest [detached] lookup 30400 proto static\n30403:\tfrom all lookup 30400 proto static", "stderr": "", "rc": 0, "cmd": ["ip", "rule", "list", "table", "30400"], "start": "2024-09-24 14:57:23.330119", "end": "2024-09-24 14:57:23.334256", "delta": "0:00:00.004137", "msg": "", "invocation": {"module_args": {"_raw_params": "ip rule list table 30400", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 44109 1727204243.35685: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.47 closed. 
<<< 44109 1727204243.35689: stdout chunk (state=3): >>><<< 44109 1727204243.35692: stderr chunk (state=3): >>><<< 44109 1727204243.35694: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "30400:\tfrom all to 198.51.100.128/26 lookup 30400 proto static\n30401:\tfrom all iif iiftest [detached] lookup 30400 proto static\n30402:\tfrom all oif oiftest [detached] lookup 30400 proto static\n30403:\tfrom all lookup 30400 proto static", "stderr": "", "rc": 0, "cmd": ["ip", "rule", "list", "table", "30400"], "start": "2024-09-24 14:57:23.330119", "end": "2024-09-24 14:57:23.334256", "delta": "0:00:00.004137", "msg": "", "invocation": {"module_args": {"_raw_params": "ip rule list table 30400", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 
debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.47 closed. 44109 1727204243.35698: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip rule list table 30400', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204243.0684242-46053-165197739467627/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 44109 1727204243.35704: _low_level_execute_command(): starting 44109 1727204243.35706: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204243.0684242-46053-165197739467627/ > /dev/null 2>&1 && sleep 0' 44109 1727204243.36173: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44109 1727204243.36189: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 44109 1727204243.36205: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44109 1727204243.36291: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 44109 1727204243.36387: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44109 1727204243.38864: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44109 1727204243.38877: stdout chunk (state=3): >>><<< 44109 1727204243.38904: stderr chunk (state=3): >>><<< 44109 1727204243.38938: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master 
session id: 2 debug2: Received exit status from master 0 44109 1727204243.38950: handler run complete 44109 1727204243.38982: Evaluated conditional (False): False 44109 1727204243.39002: attempt loop complete, returning result 44109 1727204243.39015: _execute() done 44109 1727204243.39022: dumping result to json 44109 1727204243.39082: done dumping result, returning 44109 1727204243.39085: done running TaskExecutor() for managed-node1/TASK: Get the routing rule for looking up the table 30400 [028d2410-947f-ed67-a560-00000000005d] 44109 1727204243.39087: sending task result for task 028d2410-947f-ed67-a560-00000000005d ok: [managed-node1] => { "changed": false, "cmd": [ "ip", "rule", "list", "table", "30400" ], "delta": "0:00:00.004137", "end": "2024-09-24 14:57:23.334256", "rc": 0, "start": "2024-09-24 14:57:23.330119" } STDOUT: 30400: from all to 198.51.100.128/26 lookup 30400 proto static 30401: from all iif iiftest [detached] lookup 30400 proto static 30402: from all oif oiftest [detached] lookup 30400 proto static 30403: from all lookup 30400 proto static 44109 1727204243.39246: no more pending results, returning what we have 44109 1727204243.39250: results queue empty 44109 1727204243.39251: checking for any_errors_fatal 44109 1727204243.39258: done checking for any_errors_fatal 44109 1727204243.39259: checking for max_fail_percentage 44109 1727204243.39261: done checking for max_fail_percentage 44109 1727204243.39262: checking to see if all hosts have failed and the running result is not ok 44109 1727204243.39263: done checking to see if all hosts have failed 44109 1727204243.39263: getting the remaining hosts for this loop 44109 1727204243.39265: done getting the remaining hosts for this loop 44109 1727204243.39268: getting the next task for host managed-node1 44109 1727204243.39273: done getting next task for host managed-node1 44109 1727204243.39277: ^ task is: TASK: Get the routing rule for looking up the table 30600 44109 1727204243.39279: ^ state is: 
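The task result above captured four policy-routing rules from `ip rule list table 30400`. As a minimal sketch (a hypothetical helper, not part of Ansible or the test playbook), the returned stdout can be split into (priority, rule-text) pairs like this:

```python
def parse_ip_rules(stdout: str):
    """Split each `ip rule` output line into (priority, rule_text)."""
    rules = []
    for line in stdout.splitlines():
        prio, _, rest = line.partition(":")
        rules.append((int(prio), rest.strip()))
    return rules

# stdout exactly as reported by the command module above
sample = (
    "30400:\tfrom all to 198.51.100.128/26 lookup 30400 proto static\n"
    "30401:\tfrom all iif iiftest [detached] lookup 30400 proto static\n"
    "30402:\tfrom all oif oiftest [detached] lookup 30400 proto static\n"
    "30403:\tfrom all lookup 30400 proto static"
)

rules = parse_ip_rules(sample)
# Four rules with priorities 30400-30403, all pointing at table 30400.
```

A check along these lines is how the playbook's later assertions could verify that the role installed exactly the expected rules for that table.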
HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44109 1727204243.39283: getting variables 44109 1727204243.39285: in VariableManager get_vars() 44109 1727204243.39320: Calling all_inventory to load vars for managed-node1 44109 1727204243.39323: Calling groups_inventory to load vars for managed-node1 44109 1727204243.39325: Calling all_plugins_inventory to load vars for managed-node1 44109 1727204243.39335: Calling all_plugins_play to load vars for managed-node1 44109 1727204243.39338: Calling groups_plugins_inventory to load vars for managed-node1 44109 1727204243.39340: Calling groups_plugins_play to load vars for managed-node1 44109 1727204243.39895: done sending task result for task 028d2410-947f-ed67-a560-00000000005d 44109 1727204243.39899: WORKER PROCESS EXITING 44109 1727204243.40544: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44109 1727204243.41385: done with get_vars() 44109 1727204243.41401: done getting variables 44109 1727204243.41448: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Get the routing rule for looking up the table 30600] ********************* task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_routing_rules.yml:129 Tuesday 24 September 2024 14:57:23 -0400 (0:00:00.391) 0:00:20.210 ***** 44109 1727204243.41468: entering _queue_task() for managed-node1/command 44109 1727204243.41714: worker is 1 (out of 1 available) 44109 
1727204243.41727: exiting _queue_task() for managed-node1/command 44109 1727204243.41738: done queuing things up, now waiting for results queue to drain 44109 1727204243.41739: waiting for pending results... 44109 1727204243.41916: running TaskExecutor() for managed-node1/TASK: Get the routing rule for looking up the table 30600 44109 1727204243.41977: in run() - task 028d2410-947f-ed67-a560-00000000005e 44109 1727204243.41993: variable 'ansible_search_path' from source: unknown 44109 1727204243.42022: calling self._execute() 44109 1727204243.42098: variable 'ansible_host' from source: host vars for 'managed-node1' 44109 1727204243.42102: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44109 1727204243.42110: variable 'omit' from source: magic vars 44109 1727204243.42381: variable 'ansible_distribution_major_version' from source: facts 44109 1727204243.42391: Evaluated conditional (ansible_distribution_major_version != '6'): True 44109 1727204243.42470: variable 'ansible_distribution_major_version' from source: facts 44109 1727204243.42474: Evaluated conditional (ansible_distribution_major_version != "7"): True 44109 1727204243.42482: variable 'omit' from source: magic vars 44109 1727204243.42498: variable 'omit' from source: magic vars 44109 1727204243.42528: variable 'omit' from source: magic vars 44109 1727204243.42560: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 44109 1727204243.42588: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 44109 1727204243.42604: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 44109 1727204243.42620: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44109 1727204243.42629: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py 
(found_in_cache=True, class_only=False) 44109 1727204243.42655: variable 'inventory_hostname' from source: host vars for 'managed-node1' 44109 1727204243.42658: variable 'ansible_host' from source: host vars for 'managed-node1' 44109 1727204243.42660: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44109 1727204243.42733: Set connection var ansible_connection to ssh 44109 1727204243.42738: Set connection var ansible_timeout to 10 44109 1727204243.42745: Set connection var ansible_module_compression to ZIP_DEFLATED 44109 1727204243.42751: Set connection var ansible_pipelining to False 44109 1727204243.42756: Set connection var ansible_shell_executable to /bin/sh 44109 1727204243.42761: Set connection var ansible_shell_type to sh 44109 1727204243.42779: variable 'ansible_shell_executable' from source: unknown 44109 1727204243.42782: variable 'ansible_connection' from source: unknown 44109 1727204243.42784: variable 'ansible_module_compression' from source: unknown 44109 1727204243.42786: variable 'ansible_shell_type' from source: unknown 44109 1727204243.42789: variable 'ansible_shell_executable' from source: unknown 44109 1727204243.42791: variable 'ansible_host' from source: host vars for 'managed-node1' 44109 1727204243.42795: variable 'ansible_pipelining' from source: unknown 44109 1727204243.42797: variable 'ansible_timeout' from source: unknown 44109 1727204243.42801: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44109 1727204243.42908: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 44109 1727204243.42917: variable 'omit' from source: magic vars 44109 1727204243.42920: starting attempt loop 44109 1727204243.42923: running the handler 44109 
1727204243.42937: _low_level_execute_command(): starting 44109 1727204243.42945: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 44109 1727204243.43460: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44109 1727204243.43463: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44109 1727204243.43466: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44109 1727204243.43468: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44109 1727204243.43528: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 44109 1727204243.43531: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44109 1727204243.43533: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44109 1727204243.43618: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44109 1727204243.45404: stdout chunk (state=3): >>>/root <<< 44109 1727204243.45501: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44109 1727204243.45534: stderr chunk (state=3): >>><<< 44109 1727204243.45537: stdout chunk (state=3): >>><<< 44109 
1727204243.45559: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44109 1727204243.45570: _low_level_execute_command(): starting 44109 1727204243.45579: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204243.455578-46070-201363686915265 `" && echo ansible-tmp-1727204243.455578-46070-201363686915265="` echo /root/.ansible/tmp/ansible-tmp-1727204243.455578-46070-201363686915265 `" ) && sleep 0' 44109 1727204243.46021: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 44109 1727204243.46025: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44109 1727204243.46046: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44109 1727204243.46090: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK <<< 44109 1727204243.46102: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44109 1727204243.46190: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44109 1727204243.48305: stdout chunk (state=3): >>>ansible-tmp-1727204243.455578-46070-201363686915265=/root/.ansible/tmp/ansible-tmp-1727204243.455578-46070-201363686915265 <<< 44109 1727204243.48419: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44109 1727204243.48447: stderr chunk (state=3): >>><<< 44109 1727204243.48450: stdout chunk (state=3): >>><<< 44109 1727204243.48465: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204243.455578-46070-201363686915265=/root/.ansible/tmp/ansible-tmp-1727204243.455578-46070-201363686915265 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match 
for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44109 1727204243.48493: variable 'ansible_module_compression' from source: unknown 44109 1727204243.48538: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-44109pzfqangk/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 44109 1727204243.48566: variable 'ansible_facts' from source: unknown 44109 1727204243.48624: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204243.455578-46070-201363686915265/AnsiballZ_command.py 44109 1727204243.48723: Sending initial data 44109 1727204243.48726: Sent initial data (155 bytes) 44109 1727204243.49141: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 44109 1727204243.49149: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44109 1727204243.49171: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 44109 1727204243.49174: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44109 1727204243.49178: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44109 1727204243.49232: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 44109 1727204243.49239: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 44109 1727204243.49319: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44109 1727204243.51071: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 <<< 44109 1727204243.51078: stderr chunk (state=3): >>>debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 44109 1727204243.51145: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH 
"." <<< 44109 1727204243.51222: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-44109pzfqangk/tmp6cq0x3ar /root/.ansible/tmp/ansible-tmp-1727204243.455578-46070-201363686915265/AnsiballZ_command.py <<< 44109 1727204243.51225: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204243.455578-46070-201363686915265/AnsiballZ_command.py" <<< 44109 1727204243.51294: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-44109pzfqangk/tmp6cq0x3ar" to remote "/root/.ansible/tmp/ansible-tmp-1727204243.455578-46070-201363686915265/AnsiballZ_command.py" <<< 44109 1727204243.51297: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204243.455578-46070-201363686915265/AnsiballZ_command.py" <<< 44109 1727204243.51980: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44109 1727204243.52021: stderr chunk (state=3): >>><<< 44109 1727204243.52025: stdout chunk (state=3): >>><<< 44109 1727204243.52065: done transferring module to remote 44109 1727204243.52074: _low_level_execute_command(): starting 44109 1727204243.52080: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204243.455578-46070-201363686915265/ /root/.ansible/tmp/ansible-tmp-1727204243.455578-46070-201363686915265/AnsiballZ_command.py && sleep 0' 44109 1727204243.52515: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 44109 1727204243.52518: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found <<< 44109 1727204243.52521: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 44109 1727204243.52523: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44109 1727204243.52530: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44109 1727204243.52583: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 44109 1727204243.52588: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 44109 1727204243.52665: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44109 1727204243.54615: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44109 1727204243.54639: stderr chunk (state=3): >>><<< 44109 1727204243.54642: stdout chunk (state=3): >>><<< 44109 1727204243.54656: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44109 1727204243.54659: _low_level_execute_command(): starting 44109 1727204243.54662: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204243.455578-46070-201363686915265/AnsiballZ_command.py && sleep 0' 44109 1727204243.55080: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44109 1727204243.55088: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44109 1727204243.55106: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 44109 1727204243.55111: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44109 1727204243.55163: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at 
'/root/.ansible/cp/a0f5415566' <<< 44109 1727204243.55166: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 44109 1727204243.55257: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44109 1727204243.72099: stdout chunk (state=3): >>> {"changed": true, "stdout": "30600:\tfrom all to 2001:db8::4/32 lookup 30600 proto static\n30601:\tnot from all dport 128-256 lookup 30600 proto static\n30602:\tfrom all lookup 30600 proto static", "stderr": "", "rc": 0, "cmd": ["ip", "-6", "rule", "list", "table", "30600"], "start": "2024-09-24 14:57:23.715127", "end": "2024-09-24 14:57:23.719056", "delta": "0:00:00.003929", "msg": "", "invocation": {"module_args": {"_raw_params": "ip -6 rule list table 30600", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 44109 1727204243.74082: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.47 closed. 
<<< 44109 1727204243.74086: stdout chunk (state=3): >>><<< 44109 1727204243.74088: stderr chunk (state=3): >>><<< 44109 1727204243.74091: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "30600:\tfrom all to 2001:db8::4/32 lookup 30600 proto static\n30601:\tnot from all dport 128-256 lookup 30600 proto static\n30602:\tfrom all lookup 30600 proto static", "stderr": "", "rc": 0, "cmd": ["ip", "-6", "rule", "list", "table", "30600"], "start": "2024-09-24 14:57:23.715127", "end": "2024-09-24 14:57:23.719056", "delta": "0:00:00.003929", "msg": "", "invocation": {"module_args": {"_raw_params": "ip -6 rule list table 30600", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: 
Received exit status from master 0 Shared connection to 10.31.14.47 closed. 44109 1727204243.74095: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip -6 rule list table 30600', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204243.455578-46070-201363686915265/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 44109 1727204243.74098: _low_level_execute_command(): starting 44109 1727204243.74100: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204243.455578-46070-201363686915265/ > /dev/null 2>&1 && sleep 0' 44109 1727204243.75195: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: 
match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44109 1727204243.75335: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK <<< 44109 1727204243.75466: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44109 1727204243.75579: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44109 1727204243.77552: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44109 1727204243.77556: stdout chunk (state=3): >>><<< 44109 1727204243.77558: stderr chunk (state=3): >>><<< 44109 1727204243.77586: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44109 1727204243.77589: handler run complete 44109 
1727204243.77617: Evaluated conditional (False): False 44109 1727204243.77631: attempt loop complete, returning result 44109 1727204243.77634: _execute() done 44109 1727204243.77636: dumping result to json 44109 1727204243.77638: done dumping result, returning 44109 1727204243.77646: done running TaskExecutor() for managed-node1/TASK: Get the routing rule for looking up the table 30600 [028d2410-947f-ed67-a560-00000000005e] 44109 1727204243.77648: sending task result for task 028d2410-947f-ed67-a560-00000000005e 44109 1727204243.77755: done sending task result for task 028d2410-947f-ed67-a560-00000000005e 44109 1727204243.77758: WORKER PROCESS EXITING ok: [managed-node1] => { "changed": false, "cmd": [ "ip", "-6", "rule", "list", "table", "30600" ], "delta": "0:00:00.003929", "end": "2024-09-24 14:57:23.719056", "rc": 0, "start": "2024-09-24 14:57:23.715127" } STDOUT: 30600: from all to 2001:db8::4/32 lookup 30600 proto static 30601: not from all dport 128-256 lookup 30600 proto static 30602: from all lookup 30600 proto static 44109 1727204243.77879: no more pending results, returning what we have 44109 1727204243.77883: results queue empty 44109 1727204243.77884: checking for any_errors_fatal 44109 1727204243.77894: done checking for any_errors_fatal 44109 1727204243.77895: checking for max_fail_percentage 44109 1727204243.77898: done checking for max_fail_percentage 44109 1727204243.77899: checking to see if all hosts have failed and the running result is not ok 44109 1727204243.77900: done checking to see if all hosts have failed 44109 1727204243.77901: getting the remaining hosts for this loop 44109 1727204243.77902: done getting the remaining hosts for this loop 44109 1727204243.77907: getting the next task for host managed-node1 44109 1727204243.77913: done getting next task for host managed-node1 44109 1727204243.77916: ^ task is: TASK: Get the routing rule for looking up the table 'custom' 44109 1727204243.77919: ^ state is: HOST STATE: block=2, task=14, 
rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44109 1727204243.77923: getting variables 44109 1727204243.77925: in VariableManager get_vars() 44109 1727204243.77965: Calling all_inventory to load vars for managed-node1 44109 1727204243.77968: Calling groups_inventory to load vars for managed-node1 44109 1727204243.77971: Calling all_plugins_inventory to load vars for managed-node1 44109 1727204243.78118: Calling all_plugins_play to load vars for managed-node1 44109 1727204243.78123: Calling groups_plugins_inventory to load vars for managed-node1 44109 1727204243.78127: Calling groups_plugins_play to load vars for managed-node1 44109 1727204243.80017: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44109 1727204243.81754: done with get_vars() 44109 1727204243.81782: done getting variables 44109 1727204243.81848: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Get the routing rule for looking up the table 'custom'] ****************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_routing_rules.yml:136 Tuesday 24 September 2024 14:57:23 -0400 (0:00:00.404) 0:00:20.615 ***** 44109 1727204243.81883: entering _queue_task() for managed-node1/command 44109 1727204243.82278: worker is 1 (out of 1 available) 44109 1727204243.82291: exiting _queue_task() for managed-node1/command 44109 1727204243.82379: done queuing things up, now waiting for results queue to drain 44109 1727204243.82381: 
waiting for pending results... 44109 1727204243.82666: running TaskExecutor() for managed-node1/TASK: Get the routing rule for looking up the table 'custom' 44109 1727204243.82700: in run() - task 028d2410-947f-ed67-a560-00000000005f 44109 1727204243.82706: variable 'ansible_search_path' from source: unknown 44109 1727204243.82748: calling self._execute() 44109 1727204243.82997: variable 'ansible_host' from source: host vars for 'managed-node1' 44109 1727204243.83001: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44109 1727204243.83003: variable 'omit' from source: magic vars 44109 1727204243.83274: variable 'ansible_distribution_major_version' from source: facts 44109 1727204243.83288: Evaluated conditional (ansible_distribution_major_version != '6'): True 44109 1727204243.83407: variable 'ansible_distribution_major_version' from source: facts 44109 1727204243.83416: Evaluated conditional (ansible_distribution_major_version != "7"): True 44109 1727204243.83419: variable 'omit' from source: magic vars 44109 1727204243.83446: variable 'omit' from source: magic vars 44109 1727204243.83487: variable 'omit' from source: magic vars 44109 1727204243.83528: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 44109 1727204243.83567: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 44109 1727204243.83588: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 44109 1727204243.83606: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44109 1727204243.83620: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44109 1727204243.83655: variable 'inventory_hostname' from source: host vars for 'managed-node1' 44109 1727204243.83658: variable 
'ansible_host' from source: host vars for 'managed-node1' 44109 1727204243.83663: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44109 1727204243.83770: Set connection var ansible_connection to ssh 44109 1727204243.83778: Set connection var ansible_timeout to 10 44109 1727204243.83783: Set connection var ansible_module_compression to ZIP_DEFLATED 44109 1727204243.83838: Set connection var ansible_pipelining to False 44109 1727204243.83841: Set connection var ansible_shell_executable to /bin/sh 44109 1727204243.83843: Set connection var ansible_shell_type to sh 44109 1727204243.83845: variable 'ansible_shell_executable' from source: unknown 44109 1727204243.83848: variable 'ansible_connection' from source: unknown 44109 1727204243.83850: variable 'ansible_module_compression' from source: unknown 44109 1727204243.83852: variable 'ansible_shell_type' from source: unknown 44109 1727204243.83854: variable 'ansible_shell_executable' from source: unknown 44109 1727204243.83855: variable 'ansible_host' from source: host vars for 'managed-node1' 44109 1727204243.83857: variable 'ansible_pipelining' from source: unknown 44109 1727204243.83859: variable 'ansible_timeout' from source: unknown 44109 1727204243.83861: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44109 1727204243.84085: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 44109 1727204243.84088: variable 'omit' from source: magic vars 44109 1727204243.84090: starting attempt loop 44109 1727204243.84092: running the handler 44109 1727204243.84094: _low_level_execute_command(): starting 44109 1727204243.84096: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 44109 1727204243.84860: 
stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44109 1727204243.84894: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 44109 1727204243.84925: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 44109 1727204243.85051: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44109 1727204243.86871: stdout chunk (state=3): >>>/root <<< 44109 1727204243.87052: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44109 1727204243.87191: stdout chunk (state=3): >>><<< 44109 1727204243.87194: stderr chunk (state=3): >>><<< 44109 1727204243.87197: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found 
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44109 1727204243.87199: _low_level_execute_command(): starting 44109 1727204243.87203: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204243.8709571-46091-174287264914331 `" && echo ansible-tmp-1727204243.8709571-46091-174287264914331="` echo /root/.ansible/tmp/ansible-tmp-1727204243.8709571-46091-174287264914331 `" ) && sleep 0' 44109 1727204243.87757: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44109 1727204243.87772: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44109 1727204243.87792: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44109 1727204243.87903: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 
10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK <<< 44109 1727204243.88182: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44109 1727204243.88296: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44109 1727204243.90456: stdout chunk (state=3): >>>ansible-tmp-1727204243.8709571-46091-174287264914331=/root/.ansible/tmp/ansible-tmp-1727204243.8709571-46091-174287264914331 <<< 44109 1727204243.90632: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44109 1727204243.90636: stdout chunk (state=3): >>><<< 44109 1727204243.90638: stderr chunk (state=3): >>><<< 44109 1727204243.90657: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204243.8709571-46091-174287264914331=/root/.ansible/tmp/ansible-tmp-1727204243.8709571-46091-174287264914331 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44109 1727204243.90697: variable 'ansible_module_compression' from source: unknown 44109 1727204243.90846: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-44109pzfqangk/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 44109 1727204243.90849: variable 'ansible_facts' from source: unknown 44109 1727204243.90897: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204243.8709571-46091-174287264914331/AnsiballZ_command.py 44109 1727204243.91092: Sending initial data 44109 1727204243.91101: Sent initial data (156 bytes) 44109 1727204243.91703: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44109 1727204243.91773: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found 
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44109 1727204243.91816: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 44109 1727204243.91836: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44109 1727204243.91862: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44109 1727204243.91990: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44109 1727204243.93743: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 44109 1727204243.93818: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 44109 1727204243.93890: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-44109pzfqangk/tmp53sglmoo /root/.ansible/tmp/ansible-tmp-1727204243.8709571-46091-174287264914331/AnsiballZ_command.py <<< 44109 1727204243.93897: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204243.8709571-46091-174287264914331/AnsiballZ_command.py" <<< 44109 1727204243.93962: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-44109pzfqangk/tmp53sglmoo" to remote "/root/.ansible/tmp/ansible-tmp-1727204243.8709571-46091-174287264914331/AnsiballZ_command.py" <<< 44109 1727204243.93966: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204243.8709571-46091-174287264914331/AnsiballZ_command.py" <<< 44109 1727204243.94625: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44109 1727204243.94666: stderr chunk (state=3): >>><<< 44109 1727204243.94669: stdout chunk (state=3): >>><<< 44109 1727204243.94715: done transferring module to remote 44109 1727204243.94721: _low_level_execute_command(): starting 44109 1727204243.94727: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204243.8709571-46091-174287264914331/ /root/.ansible/tmp/ansible-tmp-1727204243.8709571-46091-174287264914331/AnsiballZ_command.py && sleep 0' 44109 1727204243.95147: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44109 1727204243.95182: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found <<< 44109 1727204243.95186: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 44109 1727204243.95188: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44109 1727204243.95190: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44109 1727204243.95245: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 44109 1727204243.95255: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44109 1727204243.95259: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44109 1727204243.95337: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44109 1727204243.97352: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44109 1727204243.97360: stdout chunk (state=3): >>><<< 44109 1727204243.97363: stderr chunk (state=3): >>><<< 44109 1727204243.97582: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: 
Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44109 1727204243.97586: _low_level_execute_command(): starting 44109 1727204243.97588: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204243.8709571-46091-174287264914331/AnsiballZ_command.py && sleep 0' 44109 1727204243.98077: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 44109 1727204243.98101: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44109 1727204243.98116: stderr chunk (state=3): 
>>>debug2: mux_client_hello_exchange: master version 4 <<< 44109 1727204243.98229: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44109 1727204244.15066: stdout chunk (state=3): >>> {"changed": true, "stdout": "200:\tfrom 198.51.100.56/26 lookup custom proto static", "stderr": "", "rc": 0, "cmd": ["ip", "rule", "list", "table", "custom"], "start": "2024-09-24 14:57:24.145006", "end": "2024-09-24 14:57:24.148891", "delta": "0:00:00.003885", "msg": "", "invocation": {"module_args": {"_raw_params": "ip rule list table custom", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 44109 1727204244.17062: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.47 closed. <<< 44109 1727204244.17066: stdout chunk (state=3): >>><<< 44109 1727204244.17068: stderr chunk (state=3): >>><<< 44109 1727204244.17101: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "200:\tfrom 198.51.100.56/26 lookup custom proto static", "stderr": "", "rc": 0, "cmd": ["ip", "rule", "list", "table", "custom"], "start": "2024-09-24 14:57:24.145006", "end": "2024-09-24 14:57:24.148891", "delta": "0:00:00.003885", "msg": "", "invocation": {"module_args": {"_raw_params": "ip rule list table custom", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.47 closed. 44109 1727204244.17138: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip rule list table custom', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204243.8709571-46091-174287264914331/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 44109 1727204244.17146: _low_level_execute_command(): starting 44109 1727204244.17151: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204243.8709571-46091-174287264914331/ > /dev/null 2>&1 && sleep 0' 44109 1727204244.18436: stderr chunk (state=2): >>>OpenSSH_9.8p1, 
OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 44109 1727204244.18637: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44109 1727204244.18815: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 44109 1727204244.18819: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44109 1727204244.18947: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44109 1727204244.19089: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44109 1727204244.21174: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44109 1727204244.21181: stdout chunk (state=3): >>><<< 44109 1727204244.21183: stderr chunk (state=3): >>><<< 44109 1727204244.21585: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found 
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44109 1727204244.21588: handler run complete 44109 1727204244.21591: Evaluated conditional (False): False 44109 1727204244.21593: attempt loop complete, returning result 44109 1727204244.21595: _execute() done 44109 1727204244.21597: dumping result to json 44109 1727204244.21599: done dumping result, returning 44109 1727204244.21601: done running TaskExecutor() for managed-node1/TASK: Get the routing rule for looking up the table 'custom' [028d2410-947f-ed67-a560-00000000005f] 44109 1727204244.21603: sending task result for task 028d2410-947f-ed67-a560-00000000005f 44109 1727204244.21906: done sending task result for task 028d2410-947f-ed67-a560-00000000005f 44109 1727204244.21910: WORKER PROCESS EXITING ok: [managed-node1] => { "changed": false, "cmd": [ "ip", "rule", "list", "table", "custom" ], "delta": "0:00:00.003885", "end": "2024-09-24 14:57:24.148891", "rc": 0, "start": "2024-09-24 14:57:24.145006" } STDOUT: 200: from 198.51.100.56/26 lookup custom proto static 44109 1727204244.21987: no more pending results, returning what we have 44109 1727204244.21990: results queue empty 44109 1727204244.21991: checking for any_errors_fatal 
44109 1727204244.21997: done checking for any_errors_fatal 44109 1727204244.21998: checking for max_fail_percentage 44109 1727204244.22000: done checking for max_fail_percentage 44109 1727204244.22000: checking to see if all hosts have failed and the running result is not ok 44109 1727204244.22001: done checking to see if all hosts have failed 44109 1727204244.22002: getting the remaining hosts for this loop 44109 1727204244.22003: done getting the remaining hosts for this loop 44109 1727204244.22006: getting the next task for host managed-node1 44109 1727204244.22011: done getting next task for host managed-node1 44109 1727204244.22014: ^ task is: TASK: Get the IPv4 routing rule for the connection "{{ interface }}" 44109 1727204244.22016: ^ state is: HOST STATE: block=2, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44109 1727204244.22019: getting variables 44109 1727204244.22021: in VariableManager get_vars() 44109 1727204244.22053: Calling all_inventory to load vars for managed-node1 44109 1727204244.22055: Calling groups_inventory to load vars for managed-node1 44109 1727204244.22058: Calling all_plugins_inventory to load vars for managed-node1 44109 1727204244.22067: Calling all_plugins_play to load vars for managed-node1 44109 1727204244.22070: Calling groups_plugins_inventory to load vars for managed-node1 44109 1727204244.22073: Calling groups_plugins_play to load vars for managed-node1 44109 1727204244.28071: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44109 1727204244.29658: done with get_vars() 44109 1727204244.29688: done getting variables 44109 1727204244.29741: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 44109 1727204244.29848: variable 'interface' from source: set_fact TASK [Get the IPv4 routing rule for the connection "ethtest0"] ***************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_routing_rules.yml:143 Tuesday 24 September 2024 14:57:24 -0400 (0:00:00.479) 0:00:21.095 ***** 44109 1727204244.29873: entering _queue_task() for managed-node1/command 44109 1727204244.30232: worker is 1 (out of 1 available) 44109 1727204244.30245: exiting _queue_task() for managed-node1/command 44109 1727204244.30257: done queuing things up, now waiting for results queue to drain 44109 1727204244.30258: waiting for pending results... 
44109 1727204244.30567: running TaskExecutor() for managed-node1/TASK: Get the IPv4 routing rule for the connection "ethtest0" 44109 1727204244.30656: in run() - task 028d2410-947f-ed67-a560-000000000060 44109 1727204244.30670: variable 'ansible_search_path' from source: unknown 44109 1727204244.30708: calling self._execute() 44109 1727204244.30814: variable 'ansible_host' from source: host vars for 'managed-node1' 44109 1727204244.30819: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44109 1727204244.30827: variable 'omit' from source: magic vars 44109 1727204244.31218: variable 'ansible_distribution_major_version' from source: facts 44109 1727204244.31229: Evaluated conditional (ansible_distribution_major_version != '6'): True 44109 1727204244.31235: variable 'omit' from source: magic vars 44109 1727204244.31258: variable 'omit' from source: magic vars 44109 1727204244.31362: variable 'interface' from source: set_fact 44109 1727204244.31382: variable 'omit' from source: magic vars 44109 1727204244.31428: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 44109 1727204244.31462: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 44109 1727204244.31481: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 44109 1727204244.31497: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44109 1727204244.31517: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44109 1727204244.31546: variable 'inventory_hostname' from source: host vars for 'managed-node1' 44109 1727204244.31549: variable 'ansible_host' from source: host vars for 'managed-node1' 44109 1727204244.31552: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 
44109 1727204244.31656: Set connection var ansible_connection to ssh 44109 1727204244.31659: Set connection var ansible_timeout to 10 44109 1727204244.31781: Set connection var ansible_module_compression to ZIP_DEFLATED 44109 1727204244.31785: Set connection var ansible_pipelining to False 44109 1727204244.31787: Set connection var ansible_shell_executable to /bin/sh 44109 1727204244.31789: Set connection var ansible_shell_type to sh 44109 1727204244.31792: variable 'ansible_shell_executable' from source: unknown 44109 1727204244.31794: variable 'ansible_connection' from source: unknown 44109 1727204244.31797: variable 'ansible_module_compression' from source: unknown 44109 1727204244.31800: variable 'ansible_shell_type' from source: unknown 44109 1727204244.31802: variable 'ansible_shell_executable' from source: unknown 44109 1727204244.31804: variable 'ansible_host' from source: host vars for 'managed-node1' 44109 1727204244.31806: variable 'ansible_pipelining' from source: unknown 44109 1727204244.31809: variable 'ansible_timeout' from source: unknown 44109 1727204244.31811: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44109 1727204244.31869: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 44109 1727204244.31880: variable 'omit' from source: magic vars 44109 1727204244.31885: starting attempt loop 44109 1727204244.31888: running the handler 44109 1727204244.31904: _low_level_execute_command(): starting 44109 1727204244.31916: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 44109 1727204244.32707: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44109 1727204244.32735: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 44109 1727204244.32751: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44109 1727204244.32775: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44109 1727204244.32900: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44109 1727204244.34683: stdout chunk (state=3): >>>/root <<< 44109 1727204244.34829: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44109 1727204244.34842: stderr chunk (state=3): >>><<< 44109 1727204244.34850: stdout chunk (state=3): >>><<< 44109 1727204244.34881: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44109 1727204244.34905: _low_level_execute_command(): starting 44109 1727204244.34986: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204244.3489206-46119-19540473724112 `" && echo ansible-tmp-1727204244.3489206-46119-19540473724112="` echo /root/.ansible/tmp/ansible-tmp-1727204244.3489206-46119-19540473724112 `" ) && sleep 0' 44109 1727204244.35557: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44109 1727204244.35570: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44109 1727204244.35586: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44109 1727204244.35643: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44109 1727204244.35718: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 44109 1727204244.35749: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44109 1727204244.35763: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44109 1727204244.35879: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44109 1727204244.38015: stdout chunk (state=3): >>>ansible-tmp-1727204244.3489206-46119-19540473724112=/root/.ansible/tmp/ansible-tmp-1727204244.3489206-46119-19540473724112 <<< 44109 1727204244.38283: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44109 1727204244.38287: stdout chunk (state=3): >>><<< 44109 1727204244.38289: stderr chunk (state=3): >>><<< 44109 1727204244.38292: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204244.3489206-46119-19540473724112=/root/.ansible/tmp/ansible-tmp-1727204244.3489206-46119-19540473724112 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44109 1727204244.38294: variable 'ansible_module_compression' from source: unknown 44109 1727204244.38309: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-44109pzfqangk/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 44109 1727204244.38358: variable 'ansible_facts' from source: unknown 44109 1727204244.38464: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204244.3489206-46119-19540473724112/AnsiballZ_command.py 44109 1727204244.38660: Sending initial data 44109 1727204244.38663: Sent initial data (155 bytes) 44109 1727204244.39592: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 44109 1727204244.39722: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 44109 1727204244.39796: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44109 1727204244.39820: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44109 1727204244.39962: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44109 1727204244.41716: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 44109 1727204244.41731: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 <<< 44109 1727204244.41745: stderr chunk (state=3): >>>debug2: Server supports extension "statvfs@openssh.com" revision 2 <<< 44109 1727204244.41756: stderr chunk (state=3): >>>debug2: Server supports extension "fstatvfs@openssh.com" revision 2 <<< 44109 1727204244.41768: stderr chunk (state=3): >>>debug2: Server supports extension "hardlink@openssh.com" revision 1 <<< 44109 1727204244.41782: stderr chunk (state=3): >>>debug2: Server supports extension "fsync@openssh.com" revision 1 <<< 44109 1727204244.41799: stderr chunk (state=3): >>>debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 44109 1727204244.41899: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 44109 1727204244.41986: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-44109pzfqangk/tmpl4tiegs0 /root/.ansible/tmp/ansible-tmp-1727204244.3489206-46119-19540473724112/AnsiballZ_command.py <<< 44109 1727204244.41996: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204244.3489206-46119-19540473724112/AnsiballZ_command.py" <<< 44109 1727204244.42050: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-44109pzfqangk/tmpl4tiegs0" to remote "/root/.ansible/tmp/ansible-tmp-1727204244.3489206-46119-19540473724112/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204244.3489206-46119-19540473724112/AnsiballZ_command.py" <<< 44109 1727204244.43772: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44109 1727204244.43863: stdout chunk (state=3): >>><<< 44109 1727204244.43866: stderr chunk (state=3): >>><<< 44109 1727204244.43869: done transferring module to remote 44109 1727204244.43871: _low_level_execute_command(): starting 44109 1727204244.43884: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204244.3489206-46119-19540473724112/ /root/.ansible/tmp/ansible-tmp-1727204244.3489206-46119-19540473724112/AnsiballZ_command.py && sleep 0' 44109 1727204244.44547: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 44109 1727204244.44550: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44109 1727204244.44552: stderr chunk (state=3): >>>debug1: 
configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 44109 1727204244.44554: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found <<< 44109 1727204244.44557: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44109 1727204244.44605: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 44109 1727204244.44635: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 44109 1727204244.44722: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44109 1727204244.46806: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44109 1727204244.46810: stdout chunk (state=3): >>><<< 44109 1727204244.46817: stderr chunk (state=3): >>><<< 44109 1727204244.46914: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44109 1727204244.46918: _low_level_execute_command(): starting 44109 1727204244.46921: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204244.3489206-46119-19540473724112/AnsiballZ_command.py && sleep 0' 44109 1727204244.47484: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44109 1727204244.47488: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44109 1727204244.47491: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44109 1727204244.47670: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44109 1727204244.47674: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 <<< 44109 1727204244.47679: stderr chunk (state=3): >>>debug2: match not found <<< 44109 1727204244.47681: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44109 1727204244.47684: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match 
found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 44109 1727204244.47687: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 44109 1727204244.47760: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44109 1727204244.66192: stdout chunk (state=3): >>> {"changed": true, "stdout": "ipv4.routing-rules: priority 30200 from 198.51.100.58/26 table 30200, priority 30201 from 0.0.0.0/0 fwmark 0x1/0x1 table 30200, priority 30202 from 0.0.0.0/0 ipproto 6 table 30200, priority 30203 from 0.0.0.0/0 sport 128-256 table 30200, priority 30204 from 0.0.0.0/0 tos 0x08 table 30200, priority 30400 to 198.51.100.128/26 table 30400, priority 30401 from 0.0.0.0/0 iif iiftest table 30400, priority 30402 from 0.0.0.0/0 oif oiftest table 30400, priority 30403 from 0.0.0.0/0 table 30400, priority 200 from 198.51.100.56/26 table 200", "stderr": "", "rc": 0, "cmd": ["nmcli", "-f", "ipv4.routing-rules", "c", "show", "ethtest0"], "start": "2024-09-24 14:57:24.643137", "end": "2024-09-24 14:57:24.660062", "delta": "0:00:00.016925", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli -f ipv4.routing-rules c show \"ethtest0\"", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 44109 1727204244.68018: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.47 closed. 
<<< 44109 1727204244.68046: stderr chunk (state=3): >>><<< 44109 1727204244.68050: stdout chunk (state=3): >>><<< 44109 1727204244.68068: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "ipv4.routing-rules: priority 30200 from 198.51.100.58/26 table 30200, priority 30201 from 0.0.0.0/0 fwmark 0x1/0x1 table 30200, priority 30202 from 0.0.0.0/0 ipproto 6 table 30200, priority 30203 from 0.0.0.0/0 sport 128-256 table 30200, priority 30204 from 0.0.0.0/0 tos 0x08 table 30200, priority 30400 to 198.51.100.128/26 table 30400, priority 30401 from 0.0.0.0/0 iif iiftest table 30400, priority 30402 from 0.0.0.0/0 oif oiftest table 30400, priority 30403 from 0.0.0.0/0 table 30400, priority 200 from 198.51.100.56/26 table 200", "stderr": "", "rc": 0, "cmd": ["nmcli", "-f", "ipv4.routing-rules", "c", "show", "ethtest0"], "start": "2024-09-24 14:57:24.643137", "end": "2024-09-24 14:57:24.660062", "delta": "0:00:00.016925", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli -f ipv4.routing-rules c show \"ethtest0\"", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.47 closed. 44109 1727204244.68099: done with _execute_module (ansible.legacy.command, {'_raw_params': 'nmcli -f ipv4.routing-rules c show "ethtest0"', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204244.3489206-46119-19540473724112/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 44109 1727204244.68108: _low_level_execute_command(): starting 44109 1727204244.68114: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204244.3489206-46119-19540473724112/ > /dev/null 2>&1 && sleep 0' 44109 1727204244.68542: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44109 1727204244.68546: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 
<<< 44109 1727204244.68573: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44109 1727204244.68579: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address <<< 44109 1727204244.68582: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44109 1727204244.68585: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44109 1727204244.68645: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 44109 1727204244.68651: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44109 1727204244.68654: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44109 1727204244.68730: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44109 1727204244.70687: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44109 1727204244.70714: stderr chunk (state=3): >>><<< 44109 1727204244.70718: stdout chunk (state=3): >>><<< 44109 1727204244.70734: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final 
Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44109 1727204244.70740: handler run complete 44109 1727204244.70758: Evaluated conditional (False): False 44109 1727204244.70767: attempt loop complete, returning result 44109 1727204244.70769: _execute() done 44109 1727204244.70772: dumping result to json 44109 1727204244.70778: done dumping result, returning 44109 1727204244.70786: done running TaskExecutor() for managed-node1/TASK: Get the IPv4 routing rule for the connection "ethtest0" [028d2410-947f-ed67-a560-000000000060] 44109 1727204244.70789: sending task result for task 028d2410-947f-ed67-a560-000000000060 44109 1727204244.70884: done sending task result for task 028d2410-947f-ed67-a560-000000000060 44109 1727204244.70887: WORKER PROCESS EXITING ok: [managed-node1] => { "changed": false, "cmd": [ "nmcli", "-f", "ipv4.routing-rules", "c", "show", "ethtest0" ], "delta": "0:00:00.016925", "end": "2024-09-24 14:57:24.660062", "rc": 0, "start": "2024-09-24 14:57:24.643137" } STDOUT: ipv4.routing-rules: priority 30200 from 198.51.100.58/26 table 30200, priority 30201 from 0.0.0.0/0 fwmark 0x1/0x1 table 30200, priority 30202 from 0.0.0.0/0 ipproto 6 table 30200, priority 30203 from 0.0.0.0/0 sport 128-256 table 30200, priority 30204 from 0.0.0.0/0 tos 0x08 table 30200, priority 30400 to 
198.51.100.128/26 table 30400, priority 30401 from 0.0.0.0/0 iif iiftest table 30400, priority 30402 from 0.0.0.0/0 oif oiftest table 30400, priority 30403 from 0.0.0.0/0 table 30400, priority 200 from 198.51.100.56/26 table 200 44109 1727204244.70959: no more pending results, returning what we have 44109 1727204244.70963: results queue empty 44109 1727204244.70963: checking for any_errors_fatal 44109 1727204244.70973: done checking for any_errors_fatal 44109 1727204244.70974: checking for max_fail_percentage 44109 1727204244.70977: done checking for max_fail_percentage 44109 1727204244.70978: checking to see if all hosts have failed and the running result is not ok 44109 1727204244.70979: done checking to see if all hosts have failed 44109 1727204244.70980: getting the remaining hosts for this loop 44109 1727204244.70981: done getting the remaining hosts for this loop 44109 1727204244.70984: getting the next task for host managed-node1 44109 1727204244.70990: done getting next task for host managed-node1 44109 1727204244.70992: ^ task is: TASK: Get the IPv6 routing rule for the connection "{{ interface }}" 44109 1727204244.70994: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44109 1727204244.70999: getting variables 44109 1727204244.71001: in VariableManager get_vars() 44109 1727204244.71037: Calling all_inventory to load vars for managed-node1 44109 1727204244.71040: Calling groups_inventory to load vars for managed-node1 44109 1727204244.71042: Calling all_plugins_inventory to load vars for managed-node1 44109 1727204244.71052: Calling all_plugins_play to load vars for managed-node1 44109 1727204244.71055: Calling groups_plugins_inventory to load vars for managed-node1 44109 1727204244.71057: Calling groups_plugins_play to load vars for managed-node1 44109 1727204244.71856: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44109 1727204244.72811: done with get_vars() 44109 1727204244.72829: done getting variables 44109 1727204244.72872: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 44109 1727204244.72964: variable 'interface' from source: set_fact TASK [Get the IPv6 routing rule for the connection "ethtest0"] ***************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_routing_rules.yml:149 Tuesday 24 September 2024 14:57:24 -0400 (0:00:00.431) 0:00:21.526 ***** 44109 1727204244.72987: entering _queue_task() for managed-node1/command 44109 1727204244.73241: worker is 1 (out of 1 available) 44109 1727204244.73254: exiting _queue_task() for managed-node1/command 44109 1727204244.73266: done queuing things up, now waiting for results queue to drain 44109 1727204244.73267: waiting for pending results... 
44109 1727204244.73450: running TaskExecutor() for managed-node1/TASK: Get the IPv6 routing rule for the connection "ethtest0" 44109 1727204244.73522: in run() - task 028d2410-947f-ed67-a560-000000000061 44109 1727204244.73533: variable 'ansible_search_path' from source: unknown 44109 1727204244.73563: calling self._execute() 44109 1727204244.73641: variable 'ansible_host' from source: host vars for 'managed-node1' 44109 1727204244.73645: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44109 1727204244.73656: variable 'omit' from source: magic vars 44109 1727204244.73933: variable 'ansible_distribution_major_version' from source: facts 44109 1727204244.73941: Evaluated conditional (ansible_distribution_major_version != '6'): True 44109 1727204244.73948: variable 'omit' from source: magic vars 44109 1727204244.73964: variable 'omit' from source: magic vars 44109 1727204244.74034: variable 'interface' from source: set_fact 44109 1727204244.74050: variable 'omit' from source: magic vars 44109 1727204244.74086: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 44109 1727204244.74114: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 44109 1727204244.74128: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 44109 1727204244.74142: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44109 1727204244.74153: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44109 1727204244.74178: variable 'inventory_hostname' from source: host vars for 'managed-node1' 44109 1727204244.74182: variable 'ansible_host' from source: host vars for 'managed-node1' 44109 1727204244.74185: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 
44109 1727204244.74254: Set connection var ansible_connection to ssh 44109 1727204244.74258: Set connection var ansible_timeout to 10 44109 1727204244.74268: Set connection var ansible_module_compression to ZIP_DEFLATED 44109 1727204244.74271: Set connection var ansible_pipelining to False 44109 1727204244.74273: Set connection var ansible_shell_executable to /bin/sh 44109 1727204244.74280: Set connection var ansible_shell_type to sh 44109 1727204244.74296: variable 'ansible_shell_executable' from source: unknown 44109 1727204244.74299: variable 'ansible_connection' from source: unknown 44109 1727204244.74302: variable 'ansible_module_compression' from source: unknown 44109 1727204244.74306: variable 'ansible_shell_type' from source: unknown 44109 1727204244.74308: variable 'ansible_shell_executable' from source: unknown 44109 1727204244.74310: variable 'ansible_host' from source: host vars for 'managed-node1' 44109 1727204244.74315: variable 'ansible_pipelining' from source: unknown 44109 1727204244.74318: variable 'ansible_timeout' from source: unknown 44109 1727204244.74320: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44109 1727204244.74421: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 44109 1727204244.74430: variable 'omit' from source: magic vars 44109 1727204244.74433: starting attempt loop 44109 1727204244.74436: running the handler 44109 1727204244.74450: _low_level_execute_command(): starting 44109 1727204244.74457: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 44109 1727204244.74964: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config <<< 44109 1727204244.74995: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found <<< 44109 1727204244.74998: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44109 1727204244.75001: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 44109 1727204244.75003: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44109 1727204244.75060: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 44109 1727204244.75064: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44109 1727204244.75071: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44109 1727204244.75150: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44109 1727204244.76910: stdout chunk (state=3): >>>/root <<< 44109 1727204244.77011: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44109 1727204244.77049: stderr chunk (state=3): >>><<< 44109 1727204244.77051: stdout chunk (state=3): >>><<< 44109 1727204244.77067: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44109 1727204244.77087: _low_level_execute_command(): starting 44109 1727204244.77090: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204244.7707157-46146-250068614865167 `" && echo ansible-tmp-1727204244.7707157-46146-250068614865167="` echo /root/.ansible/tmp/ansible-tmp-1727204244.7707157-46146-250068614865167 `" ) && sleep 0' 44109 1727204244.77545: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 44109 1727204244.77550: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44109 1727204244.77553: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44109 
1727204244.77563: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 44109 1727204244.77566: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44109 1727204244.77569: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44109 1727204244.77612: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 44109 1727204244.77615: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44109 1727204244.77621: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44109 1727204244.77704: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44109 1727204244.79795: stdout chunk (state=3): >>>ansible-tmp-1727204244.7707157-46146-250068614865167=/root/.ansible/tmp/ansible-tmp-1727204244.7707157-46146-250068614865167 <<< 44109 1727204244.79904: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44109 1727204244.79933: stderr chunk (state=3): >>><<< 44109 1727204244.79938: stdout chunk (state=3): >>><<< 44109 1727204244.79957: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204244.7707157-46146-250068614865167=/root/.ansible/tmp/ansible-tmp-1727204244.7707157-46146-250068614865167 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: 
match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44109 1727204244.79984: variable 'ansible_module_compression' from source: unknown 44109 1727204244.80026: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-44109pzfqangk/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 44109 1727204244.80064: variable 'ansible_facts' from source: unknown 44109 1727204244.80122: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204244.7707157-46146-250068614865167/AnsiballZ_command.py 44109 1727204244.80224: Sending initial data 44109 1727204244.80227: Sent initial data (156 bytes) 44109 1727204244.80652: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44109 1727204244.80687: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44109 1727204244.80690: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found <<< 44109 1727204244.80692: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 44109 1727204244.80694: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 44109 1727204244.80696: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44109 1727204244.80698: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44109 1727204244.80751: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 44109 1727204244.80754: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44109 1727204244.80759: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44109 1727204244.80838: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44109 1727204244.82581: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" <<< 44109 1727204244.82584: stderr chunk (state=3): >>>debug2: Server supports extension 
"users-groups-by-id@openssh.com" revision 1 <<< 44109 1727204244.82645: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 44109 1727204244.82722: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-44109pzfqangk/tmplgg3u8ge /root/.ansible/tmp/ansible-tmp-1727204244.7707157-46146-250068614865167/AnsiballZ_command.py <<< 44109 1727204244.82726: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204244.7707157-46146-250068614865167/AnsiballZ_command.py" <<< 44109 1727204244.82798: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-44109pzfqangk/tmplgg3u8ge" to remote "/root/.ansible/tmp/ansible-tmp-1727204244.7707157-46146-250068614865167/AnsiballZ_command.py" <<< 44109 1727204244.82801: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204244.7707157-46146-250068614865167/AnsiballZ_command.py" <<< 44109 1727204244.83581: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44109 1727204244.83585: stdout chunk (state=3): >>><<< 44109 1727204244.83587: stderr chunk (state=3): >>><<< 44109 1727204244.83652: done transferring module to remote 44109 1727204244.83736: _low_level_execute_command(): starting 44109 1727204244.83739: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204244.7707157-46146-250068614865167/ /root/.ansible/tmp/ansible-tmp-1727204244.7707157-46146-250068614865167/AnsiballZ_command.py && sleep 0' 44109 1727204244.84218: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44109 1727204244.84233: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44109 1727204244.84246: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44109 1727204244.84265: stderr chunk (state=3): >>>debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44109 1727204244.84296: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44109 1727204244.84381: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK <<< 44109 1727204244.84403: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44109 1727204244.84516: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44109 1727204244.86692: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44109 1727204244.86695: stdout chunk (state=3): >>><<< 44109 1727204244.86717: stderr chunk (state=3): >>><<< 44109 1727204244.86721: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: 
re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44109 1727204244.86724: _low_level_execute_command(): starting 44109 1727204244.86782: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204244.7707157-46146-250068614865167/AnsiballZ_command.py && sleep 0' 44109 1727204244.87333: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44109 1727204244.87344: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44109 1727204244.87354: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44109 1727204244.87370: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44109 1727204244.87383: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 <<< 44109 1727204244.87391: stderr chunk (state=3): >>>debug2: match not found <<< 44109 1727204244.87401: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44109 1727204244.87418: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 44109 1727204244.87476: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.14.47 is address <<< 44109 1727204244.87481: stderr chunk (state=3): >>>debug1: re-parsing 
configuration <<< 44109 1727204244.87483: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44109 1727204244.87485: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44109 1727204244.87492: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44109 1727204244.87525: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 44109 1727204244.87542: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44109 1727204244.87553: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44109 1727204244.87724: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44109 1727204245.05643: stdout chunk (state=3): >>> {"changed": true, "stdout": "ipv6.routing-rules: priority 30600 to 2001:db8::4/32 table 30600, priority 30601 not from ::/0 dport 128-256 table 30600, priority 30602 from ::/0 table 30600", "stderr": "", "rc": 0, "cmd": ["nmcli", "-f", "ipv6.routing-rules", "c", "show", "ethtest0"], "start": "2024-09-24 14:57:25.038540", "end": "2024-09-24 14:57:25.054626", "delta": "0:00:00.016086", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli -f ipv6.routing-rules c show \"ethtest0\"", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 44109 1727204245.07431: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.47 closed. 
<<< 44109 1727204245.07458: stderr chunk (state=3): >>><<< 44109 1727204245.07462: stdout chunk (state=3): >>><<< 44109 1727204245.07480: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "ipv6.routing-rules: priority 30600 to 2001:db8::4/32 table 30600, priority 30601 not from ::/0 dport 128-256 table 30600, priority 30602 from ::/0 table 30600", "stderr": "", "rc": 0, "cmd": ["nmcli", "-f", "ipv6.routing-rules", "c", "show", "ethtest0"], "start": "2024-09-24 14:57:25.038540", "end": "2024-09-24 14:57:25.054626", "delta": "0:00:00.016086", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli -f ipv6.routing-rules c show \"ethtest0\"", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: 
master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.47 closed. 44109 1727204245.07512: done with _execute_module (ansible.legacy.command, {'_raw_params': 'nmcli -f ipv6.routing-rules c show "ethtest0"', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204244.7707157-46146-250068614865167/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 44109 1727204245.07520: _low_level_execute_command(): starting 44109 1727204245.07526: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204244.7707157-46146-250068614865167/ > /dev/null 2>&1 && sleep 0' 44109 1727204245.07961: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 44109 1727204245.07995: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44109 1727204245.07998: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found <<< 44109 1727204245.08000: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44109 1727204245.08003: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing 
configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44109 1727204245.08005: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44109 1727204245.08060: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 44109 1727204245.08065: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44109 1727204245.08067: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44109 1727204245.08142: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44109 1727204245.10095: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44109 1727204245.10119: stderr chunk (state=3): >>><<< 44109 1727204245.10122: stdout chunk (state=3): >>><<< 44109 1727204245.10136: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44109 1727204245.10142: handler run complete 44109 1727204245.10161: Evaluated conditional (False): False 44109 1727204245.10170: attempt loop complete, returning result 44109 1727204245.10173: _execute() done 44109 1727204245.10176: dumping result to json 44109 1727204245.10181: done dumping result, returning 44109 1727204245.10189: done running TaskExecutor() for managed-node1/TASK: Get the IPv6 routing rule for the connection "ethtest0" [028d2410-947f-ed67-a560-000000000061] 44109 1727204245.10192: sending task result for task 028d2410-947f-ed67-a560-000000000061 44109 1727204245.10293: done sending task result for task 028d2410-947f-ed67-a560-000000000061 44109 1727204245.10296: WORKER PROCESS EXITING ok: [managed-node1] => { "changed": false, "cmd": [ "nmcli", "-f", "ipv6.routing-rules", "c", "show", "ethtest0" ], "delta": "0:00:00.016086", "end": "2024-09-24 14:57:25.054626", "rc": 0, "start": "2024-09-24 14:57:25.038540" } STDOUT: ipv6.routing-rules: priority 30600 to 2001:db8::4/32 table 30600, priority 30601 not from ::/0 dport 128-256 table 30600, priority 30602 from ::/0 table 30600 44109 1727204245.10402: no more pending results, returning what we have 44109 1727204245.10407: results queue empty 44109 1727204245.10408: checking for any_errors_fatal 44109 1727204245.10419: done checking for any_errors_fatal 44109 1727204245.10419: checking for max_fail_percentage 44109 1727204245.10421: done checking for max_fail_percentage 44109 1727204245.10422: checking to see if all hosts have failed and the running result is not ok 44109 1727204245.10423: done checking to see if all hosts have failed 44109 1727204245.10423: getting the remaining 
hosts for this loop 44109 1727204245.10425: done getting the remaining hosts for this loop 44109 1727204245.10428: getting the next task for host managed-node1 44109 1727204245.10433: done getting next task for host managed-node1 44109 1727204245.10435: ^ task is: TASK: Assert that the routing rule with table lookup 30200 matches the specified rule 44109 1727204245.10437: ^ state is: HOST STATE: block=2, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44109 1727204245.10440: getting variables 44109 1727204245.10442: in VariableManager get_vars() 44109 1727204245.10473: Calling all_inventory to load vars for managed-node1 44109 1727204245.10477: Calling groups_inventory to load vars for managed-node1 44109 1727204245.10479: Calling all_plugins_inventory to load vars for managed-node1 44109 1727204245.10488: Calling all_plugins_play to load vars for managed-node1 44109 1727204245.10490: Calling groups_plugins_inventory to load vars for managed-node1 44109 1727204245.10493: Calling groups_plugins_play to load vars for managed-node1 44109 1727204245.11386: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44109 1727204245.13057: done with get_vars() 44109 1727204245.13087: done getting variables 44109 1727204245.13164: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Assert that the routing rule with table lookup 30200 matches the specified rule] *** task path: 
/tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_routing_rules.yml:155 Tuesday 24 September 2024 14:57:25 -0400 (0:00:00.402) 0:00:21.928 ***** 44109 1727204245.13197: entering _queue_task() for managed-node1/assert 44109 1727204245.13688: worker is 1 (out of 1 available) 44109 1727204245.13699: exiting _queue_task() for managed-node1/assert 44109 1727204245.13710: done queuing things up, now waiting for results queue to drain 44109 1727204245.13711: waiting for pending results... 44109 1727204245.14002: running TaskExecutor() for managed-node1/TASK: Assert that the routing rule with table lookup 30200 matches the specified rule 44109 1727204245.14342: in run() - task 028d2410-947f-ed67-a560-000000000062 44109 1727204245.14346: variable 'ansible_search_path' from source: unknown 44109 1727204245.14349: calling self._execute() 44109 1727204245.14352: variable 'ansible_host' from source: host vars for 'managed-node1' 44109 1727204245.14354: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44109 1727204245.14357: variable 'omit' from source: magic vars 44109 1727204245.14673: variable 'ansible_distribution_major_version' from source: facts 44109 1727204245.14688: Evaluated conditional (ansible_distribution_major_version != '6'): True 44109 1727204245.14828: variable 'ansible_distribution_major_version' from source: facts 44109 1727204245.14834: Evaluated conditional (ansible_distribution_major_version != "7"): True 44109 1727204245.14841: variable 'omit' from source: magic vars 44109 1727204245.14865: variable 'omit' from source: magic vars 44109 1727204245.14921: variable 'omit' from source: magic vars 44109 1727204245.14966: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 44109 1727204245.15016: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 44109 1727204245.15039: 
trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 44109 1727204245.15058: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44109 1727204245.15070: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44109 1727204245.15114: variable 'inventory_hostname' from source: host vars for 'managed-node1' 44109 1727204245.15118: variable 'ansible_host' from source: host vars for 'managed-node1' 44109 1727204245.15121: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44109 1727204245.15237: Set connection var ansible_connection to ssh 44109 1727204245.15249: Set connection var ansible_timeout to 10 44109 1727204245.15256: Set connection var ansible_module_compression to ZIP_DEFLATED 44109 1727204245.15263: Set connection var ansible_pipelining to False 44109 1727204245.15269: Set connection var ansible_shell_executable to /bin/sh 44109 1727204245.15274: Set connection var ansible_shell_type to sh 44109 1727204245.15381: variable 'ansible_shell_executable' from source: unknown 44109 1727204245.15385: variable 'ansible_connection' from source: unknown 44109 1727204245.15388: variable 'ansible_module_compression' from source: unknown 44109 1727204245.15390: variable 'ansible_shell_type' from source: unknown 44109 1727204245.15392: variable 'ansible_shell_executable' from source: unknown 44109 1727204245.15395: variable 'ansible_host' from source: host vars for 'managed-node1' 44109 1727204245.15397: variable 'ansible_pipelining' from source: unknown 44109 1727204245.15399: variable 'ansible_timeout' from source: unknown 44109 1727204245.15401: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44109 1727204245.15499: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched 
paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 44109 1727204245.15510: variable 'omit' from source: magic vars 44109 1727204245.15516: starting attempt loop 44109 1727204245.15519: running the handler 44109 1727204245.15702: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 44109 1727204245.15895: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 44109 1727204245.15926: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 44109 1727204245.15990: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 44109 1727204245.16017: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 44109 1727204245.16078: variable 'route_rule_table_30200' from source: set_fact 44109 1727204245.16103: Evaluated conditional (route_rule_table_30200.stdout is search("30200:(\s+)from 198.51.100.58/26 lookup 30200")): True 44109 1727204245.16197: variable 'route_rule_table_30200' from source: set_fact 44109 1727204245.16217: Evaluated conditional (route_rule_table_30200.stdout is search("30201:(\s+)from all fwmark 0x1/0x1 lookup 30200")): True 44109 1727204245.16306: variable 'route_rule_table_30200' from source: set_fact 44109 1727204245.16327: Evaluated conditional (route_rule_table_30200.stdout is search("30202:(\s+)from all ipproto tcp lookup 30200")): True 44109 1727204245.16416: variable 'route_rule_table_30200' from source: set_fact 44109 1727204245.16433: Evaluated conditional (route_rule_table_30200.stdout is search("30203:(\s+)from all sport 128-256 lookup 30200")): True 44109 1727204245.16523: variable 'route_rule_table_30200' from source: set_fact 44109 1727204245.16543: Evaluated conditional 
(route_rule_table_30200.stdout is search("30204:(\s+)from all tos (0x08|throughput) lookup 30200")): True 44109 1727204245.16550: handler run complete 44109 1727204245.16562: attempt loop complete, returning result 44109 1727204245.16565: _execute() done 44109 1727204245.16567: dumping result to json 44109 1727204245.16570: done dumping result, returning 44109 1727204245.16577: done running TaskExecutor() for managed-node1/TASK: Assert that the routing rule with table lookup 30200 matches the specified rule [028d2410-947f-ed67-a560-000000000062] 44109 1727204245.16579: sending task result for task 028d2410-947f-ed67-a560-000000000062 44109 1727204245.16666: done sending task result for task 028d2410-947f-ed67-a560-000000000062 44109 1727204245.16668: WORKER PROCESS EXITING ok: [managed-node1] => { "changed": false } MSG: All assertions passed 44109 1727204245.16718: no more pending results, returning what we have 44109 1727204245.16722: results queue empty 44109 1727204245.16723: checking for any_errors_fatal 44109 1727204245.16733: done checking for any_errors_fatal 44109 1727204245.16734: checking for max_fail_percentage 44109 1727204245.16735: done checking for max_fail_percentage 44109 1727204245.16736: checking to see if all hosts have failed and the running result is not ok 44109 1727204245.16737: done checking to see if all hosts have failed 44109 1727204245.16737: getting the remaining hosts for this loop 44109 1727204245.16738: done getting the remaining hosts for this loop 44109 1727204245.16742: getting the next task for host managed-node1 44109 1727204245.16748: done getting next task for host managed-node1 44109 1727204245.16750: ^ task is: TASK: Assert that the routing rule with table lookup 30400 matches the specified rule 44109 1727204245.16751: ^ state is: HOST STATE: block=2, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44109 1727204245.16754: getting variables 44109 1727204245.16756: in VariableManager get_vars() 44109 1727204245.16793: Calling all_inventory to load vars for managed-node1 44109 1727204245.16796: Calling groups_inventory to load vars for managed-node1 44109 1727204245.16798: Calling all_plugins_inventory to load vars for managed-node1 44109 1727204245.16816: Calling all_plugins_play to load vars for managed-node1 44109 1727204245.16819: Calling groups_plugins_inventory to load vars for managed-node1 44109 1727204245.16821: Calling groups_plugins_play to load vars for managed-node1 44109 1727204245.17855: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44109 1727204245.19165: done with get_vars() 44109 1727204245.19187: done getting variables 44109 1727204245.19233: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Assert that the routing rule with table lookup 30400 matches the specified rule] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_routing_rules.yml:166 Tuesday 24 September 2024 14:57:25 -0400 (0:00:00.060) 0:00:21.988 ***** 44109 1727204245.19253: entering _queue_task() for managed-node1/assert 44109 1727204245.19514: worker is 1 (out of 1 available) 44109 1727204245.19527: exiting _queue_task() for managed-node1/assert 44109 1727204245.19538: done queuing things up, now waiting for results queue to drain 44109 1727204245.19539: waiting for pending results... 
44109 1727204245.19723: running TaskExecutor() for managed-node1/TASK: Assert that the routing rule with table lookup 30400 matches the specified rule 44109 1727204245.19794: in run() - task 028d2410-947f-ed67-a560-000000000063 44109 1727204245.19805: variable 'ansible_search_path' from source: unknown 44109 1727204245.19835: calling self._execute() 44109 1727204245.19918: variable 'ansible_host' from source: host vars for 'managed-node1' 44109 1727204245.19922: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44109 1727204245.19930: variable 'omit' from source: magic vars 44109 1727204245.20207: variable 'ansible_distribution_major_version' from source: facts 44109 1727204245.20217: Evaluated conditional (ansible_distribution_major_version != '6'): True 44109 1727204245.20292: variable 'ansible_distribution_major_version' from source: facts 44109 1727204245.20296: Evaluated conditional (ansible_distribution_major_version != "7"): True 44109 1727204245.20305: variable 'omit' from source: magic vars 44109 1727204245.20324: variable 'omit' from source: magic vars 44109 1727204245.20350: variable 'omit' from source: magic vars 44109 1727204245.20383: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 44109 1727204245.20409: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 44109 1727204245.20428: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 44109 1727204245.20442: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44109 1727204245.20452: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44109 1727204245.20478: variable 'inventory_hostname' from source: host vars for 'managed-node1' 44109 1727204245.20481: variable 'ansible_host' 
from source: host vars for 'managed-node1' 44109 1727204245.20484: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44109 1727204245.20555: Set connection var ansible_connection to ssh 44109 1727204245.20558: Set connection var ansible_timeout to 10 44109 1727204245.20564: Set connection var ansible_module_compression to ZIP_DEFLATED 44109 1727204245.20570: Set connection var ansible_pipelining to False 44109 1727204245.20577: Set connection var ansible_shell_executable to /bin/sh 44109 1727204245.20582: Set connection var ansible_shell_type to sh 44109 1727204245.20598: variable 'ansible_shell_executable' from source: unknown 44109 1727204245.20601: variable 'ansible_connection' from source: unknown 44109 1727204245.20603: variable 'ansible_module_compression' from source: unknown 44109 1727204245.20648: variable 'ansible_shell_type' from source: unknown 44109 1727204245.20651: variable 'ansible_shell_executable' from source: unknown 44109 1727204245.20653: variable 'ansible_host' from source: host vars for 'managed-node1' 44109 1727204245.20654: variable 'ansible_pipelining' from source: unknown 44109 1727204245.20656: variable 'ansible_timeout' from source: unknown 44109 1727204245.20658: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44109 1727204245.20885: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 44109 1727204245.20889: variable 'omit' from source: magic vars 44109 1727204245.20891: starting attempt loop 44109 1727204245.20894: running the handler 44109 1727204245.21006: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 44109 1727204245.21234: Loading TestModule 'core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 44109 1727204245.21278: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 44109 1727204245.21363: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 44109 1727204245.21402: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 44109 1727204245.21498: variable 'route_rule_table_30400' from source: set_fact 44109 1727204245.21538: Evaluated conditional (route_rule_table_30400.stdout is search("30400:(\s+)from all to 198.51.100.128/26 lookup 30400")): True 44109 1727204245.21691: variable 'route_rule_table_30400' from source: set_fact 44109 1727204245.21726: Evaluated conditional (route_rule_table_30400.stdout is search("30401:(\s+)from all iif iiftest \[detached\] lookup 30400")): True 44109 1727204245.21882: variable 'route_rule_table_30400' from source: set_fact 44109 1727204245.21914: Evaluated conditional (route_rule_table_30400.stdout is search("30402:(\s+)from all oif oiftest \[detached\] lookup 30400")): True 44109 1727204245.21926: handler run complete 44109 1727204245.21943: attempt loop complete, returning result 44109 1727204245.21949: _execute() done 44109 1727204245.21955: dumping result to json 44109 1727204245.21961: done dumping result, returning 44109 1727204245.21980: done running TaskExecutor() for managed-node1/TASK: Assert that the routing rule with table lookup 30400 matches the specified rule [028d2410-947f-ed67-a560-000000000063] 44109 1727204245.21989: sending task result for task 028d2410-947f-ed67-a560-000000000063 44109 1727204245.22149: done sending task result for task 028d2410-947f-ed67-a560-000000000063 44109 1727204245.22152: WORKER PROCESS EXITING ok: [managed-node1] => { "changed": false } MSG: All assertions passed 44109 1727204245.22208: no more pending results, returning what we have 44109 1727204245.22212: 
results queue empty 44109 1727204245.22213: checking for any_errors_fatal 44109 1727204245.22222: done checking for any_errors_fatal 44109 1727204245.22223: checking for max_fail_percentage 44109 1727204245.22225: done checking for max_fail_percentage 44109 1727204245.22226: checking to see if all hosts have failed and the running result is not ok 44109 1727204245.22227: done checking to see if all hosts have failed 44109 1727204245.22227: getting the remaining hosts for this loop 44109 1727204245.22229: done getting the remaining hosts for this loop 44109 1727204245.22232: getting the next task for host managed-node1 44109 1727204245.22238: done getting next task for host managed-node1 44109 1727204245.22241: ^ task is: TASK: Assert that the routing rule with table lookup 30600 matches the specified rule 44109 1727204245.22243: ^ state is: HOST STATE: block=2, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44109 1727204245.22247: getting variables 44109 1727204245.22248: in VariableManager get_vars() 44109 1727204245.22289: Calling all_inventory to load vars for managed-node1 44109 1727204245.22293: Calling groups_inventory to load vars for managed-node1 44109 1727204245.22296: Calling all_plugins_inventory to load vars for managed-node1 44109 1727204245.22308: Calling all_plugins_play to load vars for managed-node1 44109 1727204245.22311: Calling groups_plugins_inventory to load vars for managed-node1 44109 1727204245.22314: Calling groups_plugins_play to load vars for managed-node1 44109 1727204245.23968: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44109 1727204245.26296: done with get_vars() 44109 1727204245.26325: done getting variables 44109 1727204245.26389: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Assert that the routing rule with table lookup 30600 matches the specified rule] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_routing_rules.yml:175 Tuesday 24 September 2024 14:57:25 -0400 (0:00:00.071) 0:00:22.060 ***** 44109 1727204245.26419: entering _queue_task() for managed-node1/assert 44109 1727204245.26775: worker is 1 (out of 1 available) 44109 1727204245.26790: exiting _queue_task() for managed-node1/assert 44109 1727204245.26801: done queuing things up, now waiting for results queue to drain 44109 1727204245.26802: waiting for pending results... 
44109 1727204245.27324: running TaskExecutor() for managed-node1/TASK: Assert that the routing rule with table lookup 30600 matches the specified rule 44109 1727204245.27331: in run() - task 028d2410-947f-ed67-a560-000000000064 44109 1727204245.27334: variable 'ansible_search_path' from source: unknown 44109 1727204245.27372: calling self._execute() 44109 1727204245.27489: variable 'ansible_host' from source: host vars for 'managed-node1' 44109 1727204245.27500: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44109 1727204245.27518: variable 'omit' from source: magic vars 44109 1727204245.27921: variable 'ansible_distribution_major_version' from source: facts 44109 1727204245.27938: Evaluated conditional (ansible_distribution_major_version != '6'): True 44109 1727204245.28078: variable 'ansible_distribution_major_version' from source: facts 44109 1727204245.28082: Evaluated conditional (ansible_distribution_major_version != "7"): True 44109 1727204245.28086: variable 'omit' from source: magic vars 44109 1727204245.28185: variable 'omit' from source: magic vars 44109 1727204245.28189: variable 'omit' from source: magic vars 44109 1727204245.28205: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 44109 1727204245.28244: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 44109 1727204245.28265: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 44109 1727204245.28293: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44109 1727204245.28311: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44109 1727204245.28346: variable 'inventory_hostname' from source: host vars for 'managed-node1' 44109 1727204245.28353: variable 'ansible_host' 
from source: host vars for 'managed-node1' 44109 1727204245.28359: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44109 1727204245.28469: Set connection var ansible_connection to ssh 44109 1727204245.28484: Set connection var ansible_timeout to 10 44109 1727204245.28495: Set connection var ansible_module_compression to ZIP_DEFLATED 44109 1727204245.28515: Set connection var ansible_pipelining to False 44109 1727204245.28526: Set connection var ansible_shell_executable to /bin/sh 44109 1727204245.28534: Set connection var ansible_shell_type to sh 44109 1727204245.28619: variable 'ansible_shell_executable' from source: unknown 44109 1727204245.28622: variable 'ansible_connection' from source: unknown 44109 1727204245.28624: variable 'ansible_module_compression' from source: unknown 44109 1727204245.28625: variable 'ansible_shell_type' from source: unknown 44109 1727204245.28627: variable 'ansible_shell_executable' from source: unknown 44109 1727204245.28628: variable 'ansible_host' from source: host vars for 'managed-node1' 44109 1727204245.28630: variable 'ansible_pipelining' from source: unknown 44109 1727204245.28632: variable 'ansible_timeout' from source: unknown 44109 1727204245.28633: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44109 1727204245.28735: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 44109 1727204245.28752: variable 'omit' from source: magic vars 44109 1727204245.28761: starting attempt loop 44109 1727204245.28767: running the handler 44109 1727204245.28935: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 44109 1727204245.29196: Loading TestModule 'core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 44109 1727204245.29242: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 44109 1727204245.29326: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 44109 1727204245.29386: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 44109 1727204245.29494: variable 'route_rule_table_30600' from source: set_fact 44109 1727204245.29600: Evaluated conditional (route_rule_table_30600.stdout is search("30600:(\s+)from all to 2001:db8::4/32 lookup 30600")): True 44109 1727204245.29681: variable 'route_rule_table_30600' from source: set_fact 44109 1727204245.29719: Evaluated conditional (route_rule_table_30600.stdout is search("30601:(\s+)not from all dport 128-256 lookup 30600")): True 44109 1727204245.29729: handler run complete 44109 1727204245.29747: attempt loop complete, returning result 44109 1727204245.29753: _execute() done 44109 1727204245.29759: dumping result to json 44109 1727204245.29765: done dumping result, returning 44109 1727204245.29780: done running TaskExecutor() for managed-node1/TASK: Assert that the routing rule with table lookup 30600 matches the specified rule [028d2410-947f-ed67-a560-000000000064] 44109 1727204245.29789: sending task result for task 028d2410-947f-ed67-a560-000000000064 ok: [managed-node1] => { "changed": false } MSG: All assertions passed 44109 1727204245.29970: no more pending results, returning what we have 44109 1727204245.29974: results queue empty 44109 1727204245.30077: checking for any_errors_fatal 44109 1727204245.30087: done checking for any_errors_fatal 44109 1727204245.30088: checking for max_fail_percentage 44109 1727204245.30090: done checking for max_fail_percentage 44109 1727204245.30091: checking to see if all hosts have failed and the running result is not ok 44109 1727204245.30092: done checking 
to see if all hosts have failed 44109 1727204245.30092: getting the remaining hosts for this loop 44109 1727204245.30094: done getting the remaining hosts for this loop 44109 1727204245.30098: getting the next task for host managed-node1 44109 1727204245.30104: done getting next task for host managed-node1 44109 1727204245.30107: ^ task is: TASK: Assert that the routing rule with 'custom' table lookup matches the specified rule 44109 1727204245.30109: ^ state is: HOST STATE: block=2, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44109 1727204245.30113: getting variables 44109 1727204245.30116: in VariableManager get_vars() 44109 1727204245.30156: Calling all_inventory to load vars for managed-node1 44109 1727204245.30158: Calling groups_inventory to load vars for managed-node1 44109 1727204245.30161: Calling all_plugins_inventory to load vars for managed-node1 44109 1727204245.30172: Calling all_plugins_play to load vars for managed-node1 44109 1727204245.30175: Calling groups_plugins_inventory to load vars for managed-node1 44109 1727204245.30296: Calling groups_plugins_play to load vars for managed-node1 44109 1727204245.30890: done sending task result for task 028d2410-947f-ed67-a560-000000000064 44109 1727204245.30893: WORKER PROCESS EXITING 44109 1727204245.31890: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44109 1727204245.34779: done with get_vars() 44109 1727204245.34809: done getting variables 44109 1727204245.34870: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 
(found_in_cache=True, class_only=True) TASK [Assert that the routing rule with 'custom' table lookup matches the specified rule] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_routing_rules.yml:183 Tuesday 24 September 2024 14:57:25 -0400 (0:00:00.084) 0:00:22.145 ***** 44109 1727204245.34904: entering _queue_task() for managed-node1/assert 44109 1727204245.35643: worker is 1 (out of 1 available) 44109 1727204245.35657: exiting _queue_task() for managed-node1/assert 44109 1727204245.35669: done queuing things up, now waiting for results queue to drain 44109 1727204245.35670: waiting for pending results... 44109 1727204245.36393: running TaskExecutor() for managed-node1/TASK: Assert that the routing rule with 'custom' table lookup matches the specified rule 44109 1727204245.36398: in run() - task 028d2410-947f-ed67-a560-000000000065 44109 1727204245.36402: variable 'ansible_search_path' from source: unknown 44109 1727204245.36782: calling self._execute() 44109 1727204245.36786: variable 'ansible_host' from source: host vars for 'managed-node1' 44109 1727204245.36788: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44109 1727204245.36791: variable 'omit' from source: magic vars 44109 1727204245.37498: variable 'ansible_distribution_major_version' from source: facts 44109 1727204245.37520: Evaluated conditional (ansible_distribution_major_version != '6'): True 44109 1727204245.37841: variable 'ansible_distribution_major_version' from source: facts 44109 1727204245.37854: Evaluated conditional (ansible_distribution_major_version != "7"): True 44109 1727204245.37865: variable 'omit' from source: magic vars 44109 1727204245.37892: variable 'omit' from source: magic vars 44109 1727204245.37937: variable 'omit' from source: magic vars 44109 1727204245.38381: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 44109 1727204245.38385: Loading Connection 'ssh' 
from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 44109 1727204245.38387: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 44109 1727204245.38389: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44109 1727204245.38391: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44109 1727204245.38393: variable 'inventory_hostname' from source: host vars for 'managed-node1' 44109 1727204245.38395: variable 'ansible_host' from source: host vars for 'managed-node1' 44109 1727204245.38397: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44109 1727204245.38610: Set connection var ansible_connection to ssh 44109 1727204245.38625: Set connection var ansible_timeout to 10 44109 1727204245.38634: Set connection var ansible_module_compression to ZIP_DEFLATED 44109 1727204245.38644: Set connection var ansible_pipelining to False 44109 1727204245.38653: Set connection var ansible_shell_executable to /bin/sh 44109 1727204245.38661: Set connection var ansible_shell_type to sh 44109 1727204245.38690: variable 'ansible_shell_executable' from source: unknown 44109 1727204245.38699: variable 'ansible_connection' from source: unknown 44109 1727204245.38708: variable 'ansible_module_compression' from source: unknown 44109 1727204245.38718: variable 'ansible_shell_type' from source: unknown 44109 1727204245.38724: variable 'ansible_shell_executable' from source: unknown 44109 1727204245.38730: variable 'ansible_host' from source: host vars for 'managed-node1' 44109 1727204245.38737: variable 'ansible_pipelining' from source: unknown 44109 1727204245.38743: variable 'ansible_timeout' from source: unknown 44109 1727204245.38749: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 
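
The table-30600 assert earlier in the log evaluates two Jinja `search` tests against captured `ip -6 rule` output. A minimal Python sketch of those same regex checks, using assumed sample stdout (the real captured `route_rule_table_30600.stdout` is not printed in this log):

```python
import re

# Assumed sample of route_rule_table_30600.stdout; the log does not
# include the captured `ip -6 rule` output, so these lines are invented.
stdout = (
    "30600:  from all to 2001:db8::4/32 lookup 30600\n"
    "30601:  not from all dport 128-256 lookup 30600\n"
)

# The two patterns the assert task's `is search(...)` conditionals use,
# copied verbatim from the log above.
patterns = [
    r"30600:(\s+)from all to 2001:db8::4/32 lookup 30600",
    r"30601:(\s+)not from all dport 128-256 lookup 30600",
]

for pattern in patterns:
    # re.search mirrors Jinja's `search` test: match anywhere in the string.
    print(bool(re.search(pattern, stdout)))  # True, matching the log's
                                             # "Evaluated conditional ... True"
```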
44109 1727204245.39088: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 44109 1727204245.39105: variable 'omit' from source: magic vars 44109 1727204245.39117: starting attempt loop 44109 1727204245.39124: running the handler 44109 1727204245.39290: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 44109 1727204245.39531: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 44109 1727204245.39578: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 44109 1727204245.39656: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 44109 1727204245.39696: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 44109 1727204245.39790: variable 'route_rule_table_custom' from source: set_fact 44109 1727204245.39828: Evaluated conditional (route_rule_table_custom.stdout is search("200:(\s+)from 198.51.100.56/26 lookup custom")): True 44109 1727204245.39838: handler run complete 44109 1727204245.39856: attempt loop complete, returning result 44109 1727204245.39862: _execute() done 44109 1727204245.39868: dumping result to json 44109 1727204245.39873: done dumping result, returning 44109 1727204245.39886: done running TaskExecutor() for managed-node1/TASK: Assert that the routing rule with 'custom' table lookup matches the specified rule [028d2410-947f-ed67-a560-000000000065] 44109 1727204245.39897: sending task result for task 028d2410-947f-ed67-a560-000000000065 ok: [managed-node1] => { "changed": false } MSG: All assertions passed 44109 1727204245.40050: no more pending results, 
returning what we have 44109 1727204245.40054: results queue empty 44109 1727204245.40055: checking for any_errors_fatal 44109 1727204245.40063: done checking for any_errors_fatal 44109 1727204245.40064: checking for max_fail_percentage 44109 1727204245.40066: done checking for max_fail_percentage 44109 1727204245.40067: checking to see if all hosts have failed and the running result is not ok 44109 1727204245.40068: done checking to see if all hosts have failed 44109 1727204245.40069: getting the remaining hosts for this loop 44109 1727204245.40070: done getting the remaining hosts for this loop 44109 1727204245.40073: getting the next task for host managed-node1 44109 1727204245.40081: done getting next task for host managed-node1 44109 1727204245.40084: ^ task is: TASK: Assert that the specified IPv4 routing rule was configured in the connection "{{ interface }}" 44109 1727204245.40086: ^ state is: HOST STATE: block=2, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44109 1727204245.40090: getting variables 44109 1727204245.40092: in VariableManager get_vars() 44109 1727204245.40134: Calling all_inventory to load vars for managed-node1 44109 1727204245.40137: Calling groups_inventory to load vars for managed-node1 44109 1727204245.40140: Calling all_plugins_inventory to load vars for managed-node1 44109 1727204245.40151: Calling all_plugins_play to load vars for managed-node1 44109 1727204245.40154: Calling groups_plugins_inventory to load vars for managed-node1 44109 1727204245.40157: Calling groups_plugins_play to load vars for managed-node1 44109 1727204245.40989: done sending task result for task 028d2410-947f-ed67-a560-000000000065 44109 1727204245.40992: WORKER PROCESS EXITING 44109 1727204245.41842: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44109 1727204245.43582: done with get_vars() 44109 1727204245.43604: done getting variables 44109 1727204245.43665: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 44109 1727204245.43788: variable 'interface' from source: set_fact TASK [Assert that the specified IPv4 routing rule was configured in the connection "ethtest0"] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_routing_rules.yml:190 Tuesday 24 September 2024 14:57:25 -0400 (0:00:00.089) 0:00:22.234 ***** 44109 1727204245.43824: entering _queue_task() for managed-node1/assert 44109 1727204245.44160: worker is 1 (out of 1 available) 44109 1727204245.44172: exiting _queue_task() for managed-node1/assert 44109 1727204245.44388: done queuing things up, now waiting for results queue to drain 44109 1727204245.44390: waiting for pending results... 
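
The 'custom' table assert just above uses the pattern `200:(\s+)from 198.51.100.56/26 lookup custom`. One detail worth noting in a sketch (sample input assumed, since the captured `route_rule_table_custom.stdout` is not shown): the dots in the IPv4 address are unescaped, so they act as regex wildcards rather than literals.

```python
import re

# Pattern copied from the assert above; the unescaped dots in the
# address are regex wildcards, not literal dots.
pattern = r"200:(\s+)from 198.51.100.56/26 lookup custom"

# Assumed sample line mirroring route_rule_table_custom.stdout.
print(bool(re.search(pattern, "200:    from 198.51.100.56/26 lookup custom")))  # True

# Because of the wildcard dots, a malformed address would also pass;
# a stricter pattern would escape them:
# r"200:(\s+)from 198\.51\.100\.56/26 lookup custom"
print(bool(re.search(pattern, "200: from 198x51x100x56/26 lookup custom")))  # also True
```

In practice this looseness is harmless for the test, since `ip rule` only ever prints dotted addresses, but it explains why the conditional would not catch a corrupted capture.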
44109 1727204245.44480: running TaskExecutor() for managed-node1/TASK: Assert that the specified IPv4 routing rule was configured in the connection "ethtest0" 44109 1727204245.44592: in run() - task 028d2410-947f-ed67-a560-000000000066 44109 1727204245.44621: variable 'ansible_search_path' from source: unknown 44109 1727204245.44658: calling self._execute() 44109 1727204245.44764: variable 'ansible_host' from source: host vars for 'managed-node1' 44109 1727204245.44778: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44109 1727204245.44795: variable 'omit' from source: magic vars 44109 1727204245.45491: variable 'ansible_distribution_major_version' from source: facts 44109 1727204245.45495: Evaluated conditional (ansible_distribution_major_version != '6'): True 44109 1727204245.45498: variable 'omit' from source: magic vars 44109 1727204245.45500: variable 'omit' from source: magic vars 44109 1727204245.45818: variable 'interface' from source: set_fact 44109 1727204245.45822: variable 'omit' from source: magic vars 44109 1727204245.45825: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 44109 1727204245.45827: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 44109 1727204245.45916: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 44109 1727204245.45973: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44109 1727204245.46049: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44109 1727204245.46084: variable 'inventory_hostname' from source: host vars for 'managed-node1' 44109 1727204245.46092: variable 'ansible_host' from source: host vars for 'managed-node1' 44109 1727204245.46099: variable 'ansible_ssh_extra_args' from 
source: host vars for 'managed-node1' 44109 1727204245.46247: Set connection var ansible_connection to ssh 44109 1727204245.46472: Set connection var ansible_timeout to 10 44109 1727204245.46476: Set connection var ansible_module_compression to ZIP_DEFLATED 44109 1727204245.46479: Set connection var ansible_pipelining to False 44109 1727204245.46481: Set connection var ansible_shell_executable to /bin/sh 44109 1727204245.46482: Set connection var ansible_shell_type to sh 44109 1727204245.46484: variable 'ansible_shell_executable' from source: unknown 44109 1727204245.46485: variable 'ansible_connection' from source: unknown 44109 1727204245.46487: variable 'ansible_module_compression' from source: unknown 44109 1727204245.46488: variable 'ansible_shell_type' from source: unknown 44109 1727204245.46490: variable 'ansible_shell_executable' from source: unknown 44109 1727204245.46491: variable 'ansible_host' from source: host vars for 'managed-node1' 44109 1727204245.46493: variable 'ansible_pipelining' from source: unknown 44109 1727204245.46494: variable 'ansible_timeout' from source: unknown 44109 1727204245.46496: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44109 1727204245.46782: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 44109 1727204245.46804: variable 'omit' from source: magic vars 44109 1727204245.46981: starting attempt loop 44109 1727204245.46985: running the handler 44109 1727204245.47583: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 44109 1727204245.47922: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 44109 1727204245.47973: Loading TestModule 'files' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 44109 1727204245.48071: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 44109 1727204245.48110: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 44109 1727204245.48211: variable 'connection_route_rule' from source: set_fact 44109 1727204245.48252: Evaluated conditional (connection_route_rule.stdout is search("priority 30200 from 198.51.100.58/26 table 30200")): True 44109 1727204245.48408: variable 'connection_route_rule' from source: set_fact 44109 1727204245.48441: Evaluated conditional (connection_route_rule.stdout is search("priority 30201 from 0.0.0.0/0 fwmark 0x1/0x1 table 30200")): True 44109 1727204245.48587: variable 'connection_route_rule' from source: set_fact 44109 1727204245.48617: Evaluated conditional (connection_route_rule.stdout is search("priority 30202 from 0.0.0.0/0 ipproto 6 table 30200")): True 44109 1727204245.48755: variable 'connection_route_rule' from source: set_fact 44109 1727204245.48789: Evaluated conditional (connection_route_rule.stdout is search("priority 30203 from 0.0.0.0/0 sport 128-256 table 30200")): True 44109 1727204245.48928: variable 'connection_route_rule' from source: set_fact 44109 1727204245.48956: Evaluated conditional (connection_route_rule.stdout is search("priority 30204 from 0.0.0.0/0 tos 0x08 table 30200")): True 44109 1727204245.49092: variable 'connection_route_rule' from source: set_fact 44109 1727204245.49128: Evaluated conditional (connection_route_rule.stdout is search("priority 30400 to 198.51.100.128/26 table 30400")): True 44109 1727204245.49262: variable 'connection_route_rule' from source: set_fact 44109 1727204245.49293: Evaluated conditional (connection_route_rule.stdout is search("priority 30401 from 0.0.0.0/0 iif iiftest table 30400")): True 44109 1727204245.49432: variable 'connection_route_rule' from source: set_fact 
44109 1727204245.49462: Evaluated conditional (connection_route_rule.stdout is search("priority 30402 from 0.0.0.0/0 oif oiftest table 30400")): True 44109 1727204245.49603: variable 'connection_route_rule' from source: set_fact 44109 1727204245.49631: Evaluated conditional (connection_route_rule.stdout is search("priority 30403 from 0.0.0.0/0 table 30400")): True 44109 1727204245.49777: variable 'connection_route_rule' from source: set_fact 44109 1727204245.49803: Evaluated conditional (connection_route_rule.stdout is search("priority 200 from 198.51.100.56/26 table 200")): True 44109 1727204245.49819: handler run complete 44109 1727204245.49838: attempt loop complete, returning result 44109 1727204245.49878: _execute() done 44109 1727204245.49881: dumping result to json 44109 1727204245.49883: done dumping result, returning 44109 1727204245.49886: done running TaskExecutor() for managed-node1/TASK: Assert that the specified IPv4 routing rule was configured in the connection "ethtest0" [028d2410-947f-ed67-a560-000000000066] 44109 1727204245.49888: sending task result for task 028d2410-947f-ed67-a560-000000000066 ok: [managed-node1] => { "changed": false } MSG: All assertions passed 44109 1727204245.50141: no more pending results, returning what we have 44109 1727204245.50145: results queue empty 44109 1727204245.50146: checking for any_errors_fatal 44109 1727204245.50155: done checking for any_errors_fatal 44109 1727204245.50156: checking for max_fail_percentage 44109 1727204245.50158: done checking for max_fail_percentage 44109 1727204245.50158: checking to see if all hosts have failed and the running result is not ok 44109 1727204245.50159: done checking to see if all hosts have failed 44109 1727204245.50160: getting the remaining hosts for this loop 44109 1727204245.50161: done getting the remaining hosts for this loop 44109 1727204245.50172: getting the next task for host managed-node1 44109 1727204245.50180: done getting next task for host managed-node1 44109 
1727204245.50184: ^ task is: TASK: Assert that the specified IPv6 routing rule was configured in the connection "{{ interface }}" 44109 1727204245.50186: ^ state is: HOST STATE: block=2, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44109 1727204245.50191: getting variables 44109 1727204245.50193: in VariableManager get_vars() 44109 1727204245.50236: Calling all_inventory to load vars for managed-node1 44109 1727204245.50239: Calling groups_inventory to load vars for managed-node1 44109 1727204245.50241: Calling all_plugins_inventory to load vars for managed-node1 44109 1727204245.50252: Calling all_plugins_play to load vars for managed-node1 44109 1727204245.50255: Calling groups_plugins_inventory to load vars for managed-node1 44109 1727204245.50258: Calling groups_plugins_play to load vars for managed-node1 44109 1727204245.50789: done sending task result for task 028d2410-947f-ed67-a560-000000000066 44109 1727204245.50792: WORKER PROCESS EXITING 44109 1727204245.51837: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44109 1727204245.53424: done with get_vars() 44109 1727204245.53451: done getting variables 44109 1727204245.53517: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 44109 1727204245.53640: variable 'interface' from source: set_fact TASK [Assert that the specified IPv6 routing rule was configured in the connection "ethtest0"] *** task path: 
/tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_routing_rules.yml:205 Tuesday 24 September 2024 14:57:25 -0400 (0:00:00.098) 0:00:22.332 ***** 44109 1727204245.53669: entering _queue_task() for managed-node1/assert 44109 1727204245.54033: worker is 1 (out of 1 available) 44109 1727204245.54045: exiting _queue_task() for managed-node1/assert 44109 1727204245.54057: done queuing things up, now waiting for results queue to drain 44109 1727204245.54058: waiting for pending results... 44109 1727204245.54354: running TaskExecutor() for managed-node1/TASK: Assert that the specified IPv6 routing rule was configured in the connection "ethtest0" 44109 1727204245.54464: in run() - task 028d2410-947f-ed67-a560-000000000067 44109 1727204245.54488: variable 'ansible_search_path' from source: unknown 44109 1727204245.54534: calling self._execute() 44109 1727204245.54641: variable 'ansible_host' from source: host vars for 'managed-node1' 44109 1727204245.54881: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44109 1727204245.54885: variable 'omit' from source: magic vars 44109 1727204245.55050: variable 'ansible_distribution_major_version' from source: facts 44109 1727204245.55066: Evaluated conditional (ansible_distribution_major_version != '6'): True 44109 1727204245.55079: variable 'omit' from source: magic vars 44109 1727204245.55109: variable 'omit' from source: magic vars 44109 1727204245.55221: variable 'interface' from source: set_fact 44109 1727204245.55244: variable 'omit' from source: magic vars 44109 1727204245.55290: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 44109 1727204245.55335: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 44109 1727204245.55360: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 44109 1727204245.55385: Loading ShellModule 
'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44109 1727204245.55402: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44109 1727204245.55446: variable 'inventory_hostname' from source: host vars for 'managed-node1' 44109 1727204245.55455: variable 'ansible_host' from source: host vars for 'managed-node1' 44109 1727204245.55462: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44109 1727204245.55571: Set connection var ansible_connection to ssh 44109 1727204245.55584: Set connection var ansible_timeout to 10 44109 1727204245.55593: Set connection var ansible_module_compression to ZIP_DEFLATED 44109 1727204245.55603: Set connection var ansible_pipelining to False 44109 1727204245.55614: Set connection var ansible_shell_executable to /bin/sh 44109 1727204245.55623: Set connection var ansible_shell_type to sh 44109 1727204245.55649: variable 'ansible_shell_executable' from source: unknown 44109 1727204245.55659: variable 'ansible_connection' from source: unknown 44109 1727204245.55666: variable 'ansible_module_compression' from source: unknown 44109 1727204245.55672: variable 'ansible_shell_type' from source: unknown 44109 1727204245.55680: variable 'ansible_shell_executable' from source: unknown 44109 1727204245.55685: variable 'ansible_host' from source: host vars for 'managed-node1' 44109 1727204245.55690: variable 'ansible_pipelining' from source: unknown 44109 1727204245.55695: variable 'ansible_timeout' from source: unknown 44109 1727204245.55700: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44109 1727204245.55837: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 44109 1727204245.55854: variable 'omit' from source: magic vars 44109 1727204245.55862: starting attempt loop 44109 1727204245.55873: running the handler 44109 1727204245.56035: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 44109 1727204245.56636: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 44109 1727204245.56640: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 44109 1727204245.56825: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 44109 1727204245.56862: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 44109 1727204245.57131: variable 'connection_route_rule6' from source: set_fact 44109 1727204245.57165: Evaluated conditional (connection_route_rule6.stdout is search("priority 30600 to 2001:db8::4/32 table 30600")): True 44109 1727204245.57764: variable 'connection_route_rule6' from source: set_fact 44109 1727204245.57767: Evaluated conditional (connection_route_rule6.stdout is search("priority 30601 not from ::/0 dport 128-256 table 30600") or connection_route_rule6.stdout is search("not priority 30601 from ::/0 dport 128-256 table 30600")): True 44109 1727204245.57963: variable 'connection_route_rule6' from source: set_fact 44109 1727204245.58206: Evaluated conditional (connection_route_rule6.stdout is search("priority 30602 from ::/0 table 30600")): True 44109 1727204245.58215: handler run complete 44109 1727204245.58229: attempt loop complete, returning result 44109 1727204245.58232: _execute() done 44109 1727204245.58235: dumping result to json 44109 1727204245.58237: done dumping result, returning 44109 
1727204245.58244: done running TaskExecutor() for managed-node1/TASK: Assert that the specified IPv6 routing rule was configured in the connection "ethtest0" [028d2410-947f-ed67-a560-000000000067] 44109 1727204245.58247: sending task result for task 028d2410-947f-ed67-a560-000000000067 ok: [managed-node1] => { "changed": false } MSG: All assertions passed 44109 1727204245.58428: no more pending results, returning what we have 44109 1727204245.58432: results queue empty 44109 1727204245.58432: checking for any_errors_fatal 44109 1727204245.58441: done checking for any_errors_fatal 44109 1727204245.58442: checking for max_fail_percentage 44109 1727204245.58444: done checking for max_fail_percentage 44109 1727204245.58445: checking to see if all hosts have failed and the running result is not ok 44109 1727204245.58445: done checking to see if all hosts have failed 44109 1727204245.58446: getting the remaining hosts for this loop 44109 1727204245.58447: done getting the remaining hosts for this loop 44109 1727204245.58451: getting the next task for host managed-node1 44109 1727204245.58458: done getting next task for host managed-node1 44109 1727204245.58461: ^ task is: TASK: Remove the dedicated test file in `/etc/iproute2/rt_tables.d/` 44109 1727204245.58463: ^ state is: HOST STATE: block=2, task=23, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44109 1727204245.58468: getting variables 44109 1727204245.58470: in VariableManager get_vars() 44109 1727204245.58514: Calling all_inventory to load vars for managed-node1 44109 1727204245.58517: Calling groups_inventory to load vars for managed-node1 44109 1727204245.58519: Calling all_plugins_inventory to load vars for managed-node1 44109 1727204245.58529: Calling all_plugins_play to load vars for managed-node1 44109 1727204245.58531: Calling groups_plugins_inventory to load vars for managed-node1 44109 1727204245.58534: Calling groups_plugins_play to load vars for managed-node1 44109 1727204245.59183: done sending task result for task 028d2410-947f-ed67-a560-000000000067 44109 1727204245.59186: WORKER PROCESS EXITING 44109 1727204245.61641: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44109 1727204245.64926: done with get_vars() 44109 1727204245.64959: done getting variables TASK [Remove the dedicated test file in `/etc/iproute2/rt_tables.d/`] ********** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_routing_rules.yml:213 Tuesday 24 September 2024 14:57:25 -0400 (0:00:00.113) 0:00:22.446 ***** 44109 1727204245.65062: entering _queue_task() for managed-node1/file 44109 1727204245.65914: worker is 1 (out of 1 available) 44109 1727204245.65925: exiting _queue_task() for managed-node1/file 44109 1727204245.65937: done queuing things up, now waiting for results queue to drain 44109 1727204245.65938: waiting for pending results... 
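The assert task above evaluates conditionals such as `connection_route_rule6.stdout is search("priority 30600 to 2001:db8::4/32 table 30600")`. The `search` test loaded from `ansible/plugins/test/core.py` earlier in this log is regex-based, not a substring check. A minimal Python sketch of that semantics, using a hypothetical stdout value (the real contents of `connection_route_rule6.stdout` are not shown in this log):

```python
import re

def search_test(value: str, pattern: str) -> bool:
    """Mimic the Jinja2 `search` test Ansible applies here: truthy when
    the pattern matches anywhere in the string (re.search semantics).
    Note the pattern is a regex, so metacharacters like `.` act as
    wildcards even though they happen to match literally in this log."""
    return re.search(pattern, value) is not None

# Hypothetical registered command output; an assumption for illustration.
stdout = (
    "priority 30600 to 2001:db8::4/32 table 30600\n"
    "priority 30602 from ::/0 table 30600\n"
)
matched = search_test(stdout, "priority 30600 to 2001:db8::4/32 table 30600")
```

With this input, `matched` is `True`, mirroring the `Evaluated conditional (...): True` lines above.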
44109 1727204245.66227: running TaskExecutor() for managed-node1/TASK: Remove the dedicated test file in `/etc/iproute2/rt_tables.d/` 44109 1727204245.66341: in run() - task 028d2410-947f-ed67-a560-000000000068 44109 1727204245.66361: variable 'ansible_search_path' from source: unknown 44109 1727204245.66404: calling self._execute() 44109 1727204245.66514: variable 'ansible_host' from source: host vars for 'managed-node1' 44109 1727204245.66526: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44109 1727204245.66539: variable 'omit' from source: magic vars 44109 1727204245.66922: variable 'ansible_distribution_major_version' from source: facts 44109 1727204245.66938: Evaluated conditional (ansible_distribution_major_version != '6'): True 44109 1727204245.66949: variable 'omit' from source: magic vars 44109 1727204245.66972: variable 'omit' from source: magic vars 44109 1727204245.67052: variable 'omit' from source: magic vars 44109 1727204245.67056: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 44109 1727204245.67095: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 44109 1727204245.67124: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 44109 1727204245.67146: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44109 1727204245.67165: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44109 1727204245.67200: variable 'inventory_hostname' from source: host vars for 'managed-node1' 44109 1727204245.67207: variable 'ansible_host' from source: host vars for 'managed-node1' 44109 1727204245.67216: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44109 1727204245.67311: Set connection var 
ansible_connection to ssh 44109 1727204245.67378: Set connection var ansible_timeout to 10 44109 1727204245.67381: Set connection var ansible_module_compression to ZIP_DEFLATED 44109 1727204245.67382: Set connection var ansible_pipelining to False 44109 1727204245.67384: Set connection var ansible_shell_executable to /bin/sh 44109 1727204245.67386: Set connection var ansible_shell_type to sh 44109 1727204245.67387: variable 'ansible_shell_executable' from source: unknown 44109 1727204245.67389: variable 'ansible_connection' from source: unknown 44109 1727204245.67390: variable 'ansible_module_compression' from source: unknown 44109 1727204245.67392: variable 'ansible_shell_type' from source: unknown 44109 1727204245.67394: variable 'ansible_shell_executable' from source: unknown 44109 1727204245.67395: variable 'ansible_host' from source: host vars for 'managed-node1' 44109 1727204245.67403: variable 'ansible_pipelining' from source: unknown 44109 1727204245.67410: variable 'ansible_timeout' from source: unknown 44109 1727204245.67580: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44109 1727204245.67639: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 44109 1727204245.67655: variable 'omit' from source: magic vars 44109 1727204245.67664: starting attempt loop 44109 1727204245.67671: running the handler 44109 1727204245.67699: _low_level_execute_command(): starting 44109 1727204245.67715: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 44109 1727204245.68482: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44109 1727204245.68500: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 44109 1727204245.68578: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44109 1727204245.68604: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 44109 1727204245.68627: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44109 1727204245.68653: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44109 1727204245.68801: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44109 1727204245.70767: stdout chunk (state=3): >>>/root <<< 44109 1727204245.70771: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44109 1727204245.70773: stdout chunk (state=3): >>><<< 44109 1727204245.70778: stderr chunk (state=3): >>><<< 44109 1727204245.71009: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config 
debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44109 1727204245.71015: _low_level_execute_command(): starting 44109 1727204245.71018: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204245.7090912-46185-183503875281655 `" && echo ansible-tmp-1727204245.7090912-46185-183503875281655="` echo /root/.ansible/tmp/ansible-tmp-1727204245.7090912-46185-183503875281655 `" ) && sleep 0' 44109 1727204245.71688: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44109 1727204245.71741: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44109 1727204245.71823: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 44109 1727204245.71852: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44109 1727204245.71879: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44109 1727204245.72064: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44109 1727204245.74161: stdout chunk (state=3): >>>ansible-tmp-1727204245.7090912-46185-183503875281655=/root/.ansible/tmp/ansible-tmp-1727204245.7090912-46185-183503875281655 <<< 44109 1727204245.74259: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44109 1727204245.74289: stdout chunk (state=3): >>><<< 44109 1727204245.74483: stderr chunk (state=3): >>><<< 44109 1727204245.74486: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204245.7090912-46185-183503875281655=/root/.ansible/tmp/ansible-tmp-1727204245.7090912-46185-183503875281655 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking 
match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44109 1727204245.74489: variable 'ansible_module_compression' from source: unknown 44109 1727204245.74586: ANSIBALLZ: Using lock for file 44109 1727204245.74597: ANSIBALLZ: Acquiring lock 44109 1727204245.74702: ANSIBALLZ: Lock acquired: 139907468546544 44109 1727204245.74706: ANSIBALLZ: Creating module 44109 1727204245.92933: ANSIBALLZ: Writing module into payload 44109 1727204245.93284: ANSIBALLZ: Writing module 44109 1727204245.93342: ANSIBALLZ: Renaming module 44109 1727204245.93401: ANSIBALLZ: Done creating module 44109 1727204245.93431: variable 'ansible_facts' from source: unknown 44109 1727204245.93644: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204245.7090912-46185-183503875281655/AnsiballZ_file.py 44109 1727204245.94016: Sending initial data 44109 1727204245.94027: Sent initial data (153 bytes) 44109 1727204245.95181: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 44109 1727204245.95261: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration 
data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44109 1727204245.95267: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK <<< 44109 1727204245.95583: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44109 1727204245.95696: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44109 1727204245.97497: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 44109 1727204245.97569: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 44109 1727204245.97692: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-44109pzfqangk/tmpgdrumd7f /root/.ansible/tmp/ansible-tmp-1727204245.7090912-46185-183503875281655/AnsiballZ_file.py <<< 44109 1727204245.97696: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204245.7090912-46185-183503875281655/AnsiballZ_file.py" <<< 44109 1727204245.97786: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-44109pzfqangk/tmpgdrumd7f" to remote "/root/.ansible/tmp/ansible-tmp-1727204245.7090912-46185-183503875281655/AnsiballZ_file.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204245.7090912-46185-183503875281655/AnsiballZ_file.py" <<< 44109 1727204245.99429: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44109 1727204245.99433: stdout chunk (state=3): >>><<< 44109 1727204245.99462: stderr chunk (state=3): >>><<< 44109 1727204245.99506: done transferring module to remote 44109 1727204245.99518: _low_level_execute_command(): starting 44109 1727204245.99570: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204245.7090912-46185-183503875281655/ /root/.ansible/tmp/ansible-tmp-1727204245.7090912-46185-183503875281655/AnsiballZ_file.py && sleep 0' 44109 1727204246.00995: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing 
configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found <<< 44109 1727204246.00999: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44109 1727204246.01127: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 44109 1727204246.01164: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 44109 1727204246.01334: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44109 1727204246.03318: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44109 1727204246.03418: stderr chunk (state=3): >>><<< 44109 1727204246.03421: stdout chunk (state=3): >>><<< 44109 1727204246.03482: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44109 1727204246.03485: _low_level_execute_command(): starting 44109 1727204246.03488: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204245.7090912-46185-183503875281655/AnsiballZ_file.py && sleep 0' 44109 1727204246.04096: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44109 1727204246.04156: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44109 1727204246.04159: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44109 1727204246.04162: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44109 1727204246.04164: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 <<< 44109 1727204246.04166: stderr chunk (state=3): >>>debug2: match not found <<< 44109 1727204246.04169: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44109 1727204246.04171: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 44109 1727204246.04173: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.14.47 is address <<< 44109 1727204246.04181: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 44109 1727204246.04190: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44109 1727204246.04200: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44109 1727204246.04214: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf <<< 44109 1727204246.04265: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 <<< 44109 1727204246.04268: stderr chunk (state=3): >>>debug2: match found <<< 44109 1727204246.04270: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44109 1727204246.04320: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 44109 1727204246.04323: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44109 1727204246.04336: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44109 1727204246.04455: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44109 1727204246.21822: stdout chunk (state=3): >>> {"path": "/etc/iproute2/rt_tables.d/table.conf", "changed": true, "diff": {"before": {"path": "/etc/iproute2/rt_tables.d/table.conf", "state": "file"}, "after": {"path": "/etc/iproute2/rt_tables.d/table.conf", "state": "absent"}}, "state": "absent", "invocation": {"module_args": {"state": "absent", "path": "/etc/iproute2/rt_tables.d/table.conf", "recurse": false, "force": false, "follow": true, "modification_time_format": "%Y%m%d%H%M.%S", "access_time_format": "%Y%m%d%H%M.%S", "unsafe_writes": false, "_original_basename": null, "_diff_peek": null, "src": null, "modification_time": null, "access_time": null, "mode": null, "owner": null, "group": null, "seuser": null, "serole": null, "selevel": null, "setype": null, "attributes": null}}} <<< 44109 1727204246.23533: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.47 closed. 
<<< 44109 1727204246.23554: stderr chunk (state=3): >>><<< 44109 1727204246.23557: stdout chunk (state=3): >>><<< 44109 1727204246.23578: _low_level_execute_command() done: rc=0, stdout= {"path": "/etc/iproute2/rt_tables.d/table.conf", "changed": true, "diff": {"before": {"path": "/etc/iproute2/rt_tables.d/table.conf", "state": "file"}, "after": {"path": "/etc/iproute2/rt_tables.d/table.conf", "state": "absent"}}, "state": "absent", "invocation": {"module_args": {"state": "absent", "path": "/etc/iproute2/rt_tables.d/table.conf", "recurse": false, "force": false, "follow": true, "modification_time_format": "%Y%m%d%H%M.%S", "access_time_format": "%Y%m%d%H%M.%S", "unsafe_writes": false, "_original_basename": null, "_diff_peek": null, "src": null, "modification_time": null, "access_time": null, "mode": null, "owner": null, "group": null, "seuser": null, "serole": null, "selevel": null, "setype": null, "attributes": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master 
version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.47 closed. 44109 1727204246.23608: done with _execute_module (file, {'state': 'absent', 'path': '/etc/iproute2/rt_tables.d/table.conf', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'file', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204245.7090912-46185-183503875281655/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 44109 1727204246.23624: _low_level_execute_command(): starting 44109 1727204246.23628: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204245.7090912-46185-183503875281655/ > /dev/null 2>&1 && sleep 0' 44109 1727204246.24067: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44109 1727204246.24079: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44109 1727204246.24099: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration <<< 44109 1727204246.24102: stderr chunk (state=3): >>>debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44109 1727204246.24161: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 44109 1727204246.24164: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44109 1727204246.24166: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44109 1727204246.24250: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44109 1727204246.26210: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44109 1727204246.26241: stderr chunk (state=3): >>><<< 44109 1727204246.26244: stdout chunk (state=3): >>><<< 44109 1727204246.26259: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing 
master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44109 1727204246.26265: handler run complete 44109 1727204246.26284: attempt loop complete, returning result 44109 1727204246.26287: _execute() done 44109 1727204246.26289: dumping result to json 44109 1727204246.26293: done dumping result, returning 44109 1727204246.26301: done running TaskExecutor() for managed-node1/TASK: Remove the dedicated test file in `/etc/iproute2/rt_tables.d/` [028d2410-947f-ed67-a560-000000000068] 44109 1727204246.26305: sending task result for task 028d2410-947f-ed67-a560-000000000068 44109 1727204246.26406: done sending task result for task 028d2410-947f-ed67-a560-000000000068 44109 1727204246.26408: WORKER PROCESS EXITING changed: [managed-node1] => { "changed": true, "path": "/etc/iproute2/rt_tables.d/table.conf", "state": "absent" } 44109 1727204246.26500: no more pending results, returning what we have 44109 1727204246.26503: results queue empty 44109 1727204246.26504: checking for any_errors_fatal 44109 1727204246.26514: done checking for any_errors_fatal 44109 1727204246.26515: checking for max_fail_percentage 44109 1727204246.26516: done checking for max_fail_percentage 44109 1727204246.26517: checking to see if all hosts have failed and the running result is not ok 44109 1727204246.26518: done checking to see if all hosts have failed 44109 1727204246.26519: getting the remaining hosts for this loop 44109 1727204246.26520: done getting the remaining hosts for this loop 44109 1727204246.26523: getting the next task for host managed-node1 44109 1727204246.26529: done getting next task for host managed-node1 44109 1727204246.26531: ^ task is: TASK: meta (flush_handlers) 44109 1727204246.26533: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, 
pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44109 1727204246.26537: getting variables 44109 1727204246.26539: in VariableManager get_vars() 44109 1727204246.26579: Calling all_inventory to load vars for managed-node1 44109 1727204246.26581: Calling groups_inventory to load vars for managed-node1 44109 1727204246.26584: Calling all_plugins_inventory to load vars for managed-node1 44109 1727204246.26594: Calling all_plugins_play to load vars for managed-node1 44109 1727204246.26597: Calling groups_plugins_inventory to load vars for managed-node1 44109 1727204246.26599: Calling groups_plugins_play to load vars for managed-node1 44109 1727204246.27429: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44109 1727204246.28288: done with get_vars() 44109 1727204246.28308: done getting variables 44109 1727204246.28359: in VariableManager get_vars() 44109 1727204246.28368: Calling all_inventory to load vars for managed-node1 44109 1727204246.28370: Calling groups_inventory to load vars for managed-node1 44109 1727204246.28371: Calling all_plugins_inventory to load vars for managed-node1 44109 1727204246.28374: Calling all_plugins_play to load vars for managed-node1 44109 1727204246.28377: Calling groups_plugins_inventory to load vars for managed-node1 44109 1727204246.28379: Calling groups_plugins_play to load vars for managed-node1 44109 1727204246.29096: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44109 1727204246.29947: done with get_vars() 44109 1727204246.29968: done queuing things up, now waiting for results queue to drain 44109 1727204246.29970: results queue empty 44109 1727204246.29970: checking for any_errors_fatal 44109 1727204246.29972: done checking for any_errors_fatal 44109 1727204246.29973: 
checking for max_fail_percentage 44109 1727204246.29974: done checking for max_fail_percentage 44109 1727204246.29974: checking to see if all hosts have failed and the running result is not ok 44109 1727204246.29975: done checking to see if all hosts have failed 44109 1727204246.29977: getting the remaining hosts for this loop 44109 1727204246.29978: done getting the remaining hosts for this loop 44109 1727204246.29980: getting the next task for host managed-node1 44109 1727204246.29983: done getting next task for host managed-node1 44109 1727204246.29983: ^ task is: TASK: meta (flush_handlers) 44109 1727204246.29985: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44109 1727204246.29986: getting variables 44109 1727204246.29987: in VariableManager get_vars() 44109 1727204246.29995: Calling all_inventory to load vars for managed-node1 44109 1727204246.29996: Calling groups_inventory to load vars for managed-node1 44109 1727204246.29997: Calling all_plugins_inventory to load vars for managed-node1 44109 1727204246.30002: Calling all_plugins_play to load vars for managed-node1 44109 1727204246.30003: Calling groups_plugins_inventory to load vars for managed-node1 44109 1727204246.30005: Calling groups_plugins_play to load vars for managed-node1 44109 1727204246.30656: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44109 1727204246.31498: done with get_vars() 44109 1727204246.31519: done getting variables 44109 1727204246.31558: in VariableManager get_vars() 44109 1727204246.31568: Calling all_inventory to load vars for managed-node1 44109 1727204246.31569: Calling groups_inventory to load vars for managed-node1 44109 1727204246.31571: Calling all_plugins_inventory 
to load vars for managed-node1 44109 1727204246.31574: Calling all_plugins_play to load vars for managed-node1 44109 1727204246.31577: Calling groups_plugins_inventory to load vars for managed-node1 44109 1727204246.31579: Calling groups_plugins_play to load vars for managed-node1 44109 1727204246.32242: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44109 1727204246.33092: done with get_vars() 44109 1727204246.33116: done queuing things up, now waiting for results queue to drain 44109 1727204246.33118: results queue empty 44109 1727204246.33118: checking for any_errors_fatal 44109 1727204246.33119: done checking for any_errors_fatal 44109 1727204246.33120: checking for max_fail_percentage 44109 1727204246.33121: done checking for max_fail_percentage 44109 1727204246.33121: checking to see if all hosts have failed and the running result is not ok 44109 1727204246.33122: done checking to see if all hosts have failed 44109 1727204246.33122: getting the remaining hosts for this loop 44109 1727204246.33123: done getting the remaining hosts for this loop 44109 1727204246.33125: getting the next task for host managed-node1 44109 1727204246.33127: done getting next task for host managed-node1 44109 1727204246.33128: ^ task is: None 44109 1727204246.33129: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44109 1727204246.33129: done queuing things up, now waiting for results queue to drain 44109 1727204246.33130: results queue empty 44109 1727204246.33130: checking for any_errors_fatal 44109 1727204246.33131: done checking for any_errors_fatal 44109 1727204246.33131: checking for max_fail_percentage 44109 1727204246.33132: done checking for max_fail_percentage 44109 1727204246.33132: checking to see if all hosts have failed and the running result is not ok 44109 1727204246.33133: done checking to see if all hosts have failed 44109 1727204246.33134: getting the next task for host managed-node1 44109 1727204246.33135: done getting next task for host managed-node1 44109 1727204246.33136: ^ task is: None 44109 1727204246.33136: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44109 1727204246.33189: in VariableManager get_vars() 44109 1727204246.33205: done with get_vars() 44109 1727204246.33209: in VariableManager get_vars() 44109 1727204246.33219: done with get_vars() 44109 1727204246.33221: variable 'omit' from source: magic vars 44109 1727204246.33309: variable 'profile' from source: play vars 44109 1727204246.33391: in VariableManager get_vars() 44109 1727204246.33403: done with get_vars() 44109 1727204246.33420: variable 'omit' from source: magic vars 44109 1727204246.33462: variable 'profile' from source: play vars PLAY [Set down {{ profile }}] ************************************************** 44109 1727204246.33907: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py (found_in_cache=True, class_only=False) 44109 1727204246.33932: getting the remaining hosts for this loop 44109 1727204246.33933: done getting the remaining hosts for this loop 44109 1727204246.33935: getting the next task for host managed-node1 44109 1727204246.33936: done getting next task for host managed-node1 44109 1727204246.33938: ^ task is: TASK: Gathering Facts 44109 1727204246.33940: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44109 1727204246.33942: getting variables 44109 1727204246.33943: in VariableManager get_vars() 44109 1727204246.33951: Calling all_inventory to load vars for managed-node1 44109 1727204246.33953: Calling groups_inventory to load vars for managed-node1 44109 1727204246.33954: Calling all_plugins_inventory to load vars for managed-node1 44109 1727204246.33958: Calling all_plugins_play to load vars for managed-node1 44109 1727204246.33960: Calling groups_plugins_inventory to load vars for managed-node1 44109 1727204246.33961: Calling groups_plugins_play to load vars for managed-node1 44109 1727204246.34631: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44109 1727204246.35532: done with get_vars() 44109 1727204246.35546: done getting variables 44109 1727204246.35583: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Gathering Facts] ********************************************************* task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/down_profile.yml:3 Tuesday 24 September 2024 14:57:26 -0400 (0:00:00.705) 0:00:23.152 ***** 44109 1727204246.35602: entering _queue_task() for managed-node1/gather_facts 44109 1727204246.35857: worker is 1 (out of 1 available) 44109 1727204246.35869: exiting _queue_task() for managed-node1/gather_facts 44109 1727204246.35882: done queuing things up, now waiting for results queue to drain 44109 1727204246.35884: waiting for pending results... 
44109 1727204246.36061: running TaskExecutor() for managed-node1/TASK: Gathering Facts 44109 1727204246.36127: in run() - task 028d2410-947f-ed67-a560-0000000004b1 44109 1727204246.36138: variable 'ansible_search_path' from source: unknown 44109 1727204246.36167: calling self._execute() 44109 1727204246.36246: variable 'ansible_host' from source: host vars for 'managed-node1' 44109 1727204246.36250: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44109 1727204246.36258: variable 'omit' from source: magic vars 44109 1727204246.36538: variable 'ansible_distribution_major_version' from source: facts 44109 1727204246.36547: Evaluated conditional (ansible_distribution_major_version != '6'): True 44109 1727204246.36559: variable 'omit' from source: magic vars 44109 1727204246.36581: variable 'omit' from source: magic vars 44109 1727204246.36607: variable 'omit' from source: magic vars 44109 1727204246.36643: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 44109 1727204246.36677: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 44109 1727204246.36692: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 44109 1727204246.36708: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44109 1727204246.36721: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44109 1727204246.36744: variable 'inventory_hostname' from source: host vars for 'managed-node1' 44109 1727204246.36747: variable 'ansible_host' from source: host vars for 'managed-node1' 44109 1727204246.36750: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44109 1727204246.36824: Set connection var ansible_connection to ssh 44109 1727204246.36828: Set 
connection var ansible_timeout to 10 44109 1727204246.36833: Set connection var ansible_module_compression to ZIP_DEFLATED 44109 1727204246.36840: Set connection var ansible_pipelining to False 44109 1727204246.36845: Set connection var ansible_shell_executable to /bin/sh 44109 1727204246.36849: Set connection var ansible_shell_type to sh 44109 1727204246.36868: variable 'ansible_shell_executable' from source: unknown 44109 1727204246.36871: variable 'ansible_connection' from source: unknown 44109 1727204246.36873: variable 'ansible_module_compression' from source: unknown 44109 1727204246.36877: variable 'ansible_shell_type' from source: unknown 44109 1727204246.36879: variable 'ansible_shell_executable' from source: unknown 44109 1727204246.36881: variable 'ansible_host' from source: host vars for 'managed-node1' 44109 1727204246.36884: variable 'ansible_pipelining' from source: unknown 44109 1727204246.36886: variable 'ansible_timeout' from source: unknown 44109 1727204246.36895: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44109 1727204246.37037: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 44109 1727204246.37045: variable 'omit' from source: magic vars 44109 1727204246.37048: starting attempt loop 44109 1727204246.37051: running the handler 44109 1727204246.37065: variable 'ansible_facts' from source: unknown 44109 1727204246.37084: _low_level_execute_command(): starting 44109 1727204246.37090: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 44109 1727204246.37615: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
<<< 44109 1727204246.37619: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44109 1727204246.37623: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 <<< 44109 1727204246.37625: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44109 1727204246.37668: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 44109 1727204246.37671: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44109 1727204246.37673: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44109 1727204246.37764: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44109 1727204246.39533: stdout chunk (state=3): >>>/root <<< 44109 1727204246.39625: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44109 1727204246.39656: stderr chunk (state=3): >>><<< 44109 1727204246.39660: stdout chunk (state=3): >>><<< 44109 1727204246.39682: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 
originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44109 1727204246.39693: _low_level_execute_command(): starting 44109 1727204246.39700: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204246.3968062-46219-249793691415007 `" && echo ansible-tmp-1727204246.3968062-46219-249793691415007="` echo /root/.ansible/tmp/ansible-tmp-1727204246.3968062-46219-249793691415007 `" ) && sleep 0' 44109 1727204246.40139: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 44109 1727204246.40143: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44109 1727204246.40152: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is 
address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 44109 1727204246.40155: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44109 1727204246.40206: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 44109 1727204246.40213: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44109 1727204246.40216: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44109 1727204246.40293: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44109 1727204246.42412: stdout chunk (state=3): >>>ansible-tmp-1727204246.3968062-46219-249793691415007=/root/.ansible/tmp/ansible-tmp-1727204246.3968062-46219-249793691415007 <<< 44109 1727204246.42522: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44109 1727204246.42549: stderr chunk (state=3): >>><<< 44109 1727204246.42553: stdout chunk (state=3): >>><<< 44109 1727204246.42568: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204246.3968062-46219-249793691415007=/root/.ansible/tmp/ansible-tmp-1727204246.3968062-46219-249793691415007 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is 
address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44109 1727204246.42603: variable 'ansible_module_compression' from source: unknown 44109 1727204246.42643: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-44109pzfqangk/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 44109 1727204246.42696: variable 'ansible_facts' from source: unknown 44109 1727204246.42827: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204246.3968062-46219-249793691415007/AnsiballZ_setup.py 44109 1727204246.42931: Sending initial data 44109 1727204246.42934: Sent initial data (154 bytes) 44109 1727204246.43389: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 44109 1727204246.43392: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found <<< 44109 1727204246.43395: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44109 1727204246.43398: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44109 1727204246.43400: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44109 1727204246.43452: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 44109 1727204246.43458: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44109 1727204246.43462: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44109 1727204246.43536: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44109 1727204246.45280: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" <<< 44109 1727204246.45288: stderr chunk (state=3): >>>debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 44109 1727204246.45350: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 44109 1727204246.45426: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-44109pzfqangk/tmpinam6e8z /root/.ansible/tmp/ansible-tmp-1727204246.3968062-46219-249793691415007/AnsiballZ_setup.py <<< 44109 1727204246.45429: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204246.3968062-46219-249793691415007/AnsiballZ_setup.py" <<< 44109 1727204246.45506: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-44109pzfqangk/tmpinam6e8z" to remote "/root/.ansible/tmp/ansible-tmp-1727204246.3968062-46219-249793691415007/AnsiballZ_setup.py" <<< 44109 1727204246.45510: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204246.3968062-46219-249793691415007/AnsiballZ_setup.py" <<< 44109 1727204246.46728: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44109 1727204246.46767: stderr chunk (state=3): >>><<< 44109 1727204246.46771: stdout chunk (state=3): >>><<< 44109 1727204246.46789: done transferring module to remote 44109 1727204246.46798: _low_level_execute_command(): starting 44109 1727204246.46807: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204246.3968062-46219-249793691415007/ /root/.ansible/tmp/ansible-tmp-1727204246.3968062-46219-249793691415007/AnsiballZ_setup.py && sleep 0' 44109 1727204246.47251: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 44109 1727204246.47254: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44109 1727204246.47256: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 44109 1727204246.47258: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44109 1727204246.47264: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44109 1727204246.47316: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 44109 1727204246.47319: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44109 1727204246.47324: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44109 1727204246.47404: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44109 1727204246.49357: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44109 1727204246.49384: stderr chunk (state=3): >>><<< 44109 1727204246.49387: stdout chunk (state=3): >>><<< 44109 1727204246.49400: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: 
Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44109 1727204246.49403: _low_level_execute_command(): starting 44109 1727204246.49408: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204246.3968062-46219-249793691415007/AnsiballZ_setup.py && sleep 0' 44109 1727204246.49838: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 44109 1727204246.49841: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44109 1727204246.49843: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44109 1727204246.49845: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44109 1727204246.49901: stderr chunk (state=3): 
>>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 44109 1727204246.49905: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44109 1727204246.49910: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44109 1727204246.49993: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44109 1727204247.20005: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_fips": false, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-25.el10.x86_64", "root": "UUID=973ca870-ed1b-4e56-a8b4-735608119a28", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-25.el10.x86_64", "root": "UUID=973ca870-ed1b-4e56-a8b4-735608119a28", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_env": {"SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.45.138 50362 10.31.14.47 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": 
"root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.45.138 50362 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_system": "Linux", "ansible_kernel": "6.11.0-25.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Mon Sep 16 20:35:26 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "managed-node1", "ansible_hostname": "managed-node1", "ansible_nodename": "managed-node1", "ansible_domain": "", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec277914f6c5b9c03bd977e30033112b", "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_loadavg": {"1m": 0.546875, "5m": 0.52587890625, "15m": 0.30908203125}, "ansible_fibre_channel_wwn": [], "ansible_lsb": {}, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_is_chroot": false, "ansible_iscsi_iqn": "", "ansible_ssh_host_key_rsa_public": 
"AAAAB3NzaC1yc2EAAAADAQABAAABgQCWk3MCbpUJBEaXgG200pw3DBo34ukitT+wfYcFTuNXyUUPaXQe7v940X0cf5U78BgS3AAiRxfHirMb+4r43rwxBe5tl4Vq2WM+kz3JnOtxK8ZXTmwS9PbltzX5fg5CVds9Bu6KIwABJMlgT9CTHVjFlTBc8wpoeJvx8wVgMwQlnF+PFN/lzD0gn0sF11bqe2QCvxmm9r7Lum/QdIVGgOiZMMbULJZb0Iih24Tn74Ho6n9zLSFZ5FiFifjm7M6k1hVtfcAQi3GfPdUxkN0Z66f3KaW4hptFlHxttjLliuAyfNF4UrXIab7y/nDix1ll4x4lLHsVRpcwtmVVe/Z+2/pcmRv4fjw8YzWY1pLV5u1BUUBthICgfv0pXEuLP9UD/krnjy0Ho9gAbtjdoRWtn7gvjRds+WEwk83rZS3UnAc3pl2DmHzbp4IfRC1zp8eJPJoVwcSTEr61su59tkNntjdKAcLWeas1p9XBaTIRx7aqRp7Vdet96xbSnDnBCriXgAU=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBJzECuW8BnvEbYnQxnxmebvg9TYk9r0OUd9aUg8FFv4MvjSzW8tCfnW556hw9n4PI2hShtAWz7XExrMZPlTQXRo=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key<<< 44109 1727204247.20060: stdout chunk (state=3): >>>_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAINUE+mKAHEgVl/vTdVMwRCu3lDCTOYBl1RcikvxylCeg", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_pkg_mgr": "dnf", "ansible_apparmor": {"status": "disabled"}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Tuesday", "weekday_number": "2", "weeknumber": "39", "day": "24", "hour": "14", "minute": "57", "second": "26", "epoch": "1727204246", "epoch_int": "1727204246", "date": "2024-09-24", "time": "14:57:26", "iso8601_micro": "2024-09-24T18:57:26.795267Z", "iso8601": "2024-09-24T18:57:26Z", "iso8601_basic": "20240924T145726795267", "iso8601_basic_short": "20240924T145726", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, 
"ansible_memfree_mb": 2921, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 610, "free": 2921}, "nocache": {"free": 3280, "used": 251}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec277914-f6c5-b9c0-3bd9-77e30033112b", "ansible_product_uuid": "ec277914-f6c5-b9c0-3bd9-77e30033112b", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["973ca870-ed1b-4e56-a8b4-735608119a28"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "973ca870-ed1b-4e56-a8b4-735608119a28", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["973ca870-ed1b-4e56-a8b4-735608119a28"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 838, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": 
"/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261781057536, "block_size": 4096, "block_total": 65519099, "block_available": 63911391, "block_used": 1607708, "inode_total": 131070960, "inode_available": 131027257, "inode_used": 43703, "uuid": "973ca870-ed1b-4e56-a8b4-735608119a28"}], "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:11e86335-d786-4518-8abc-c9417b351256", "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_service_mgr": "systemd", "ansible_interfaces": ["ethtest0", "eth0", "peerethtest0", "lo"], "ansible_eth0": {"device": "eth0", "macaddress": "0a:ff:dd:89:9b:e5", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.14.47", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22"}, "ipv6": [{"address": "fe80::8ff:ddff:fe89:9be5", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off 
[fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", 
"tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], 
"hw_timestamp_filters": []}, "ansible_peerethtest0": {"device": "peerethtest0", "macaddress": "f6:70:7d:ce:1e:9f", "mtu": 1500, "active": true, "type": "ether", "speed": 10000, "promisc": false, "ipv6": [{"address": "fe80::f470:7dff:fece:1e9f", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "on", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "off", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "on", "tx_vlan_offload": "on", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "on [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "on", "tx_gre_csum_segmentation": "on", "tx_ipxip4_segmentation": "on", "tx_ipxip6_segmentation": "on", "tx_udp_tnl_segmentation": "on", "tx_udp_tnl_csum_segmentation": "on", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "on", "rx_vlan_stag_hw_parse": "on", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", 
"rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_ethtest0": {"device": "ethtest0", "macaddress": "96:5a:e0:79:e4:16", "mtu": 1500, "active": true, "type": "ether", "speed": 10000, "promisc": false, "ipv4": {"address": "198.51.100.3", "broadcast": "198.51.100.63", "netmask": "255.255.255.192", "network": "198.51.100.0", "prefix": "26"}, "ipv6": [{"address": "2001:db8::2", "prefix": "32", "scope": "global"}, {"address": "fe80::945a:e0ff:fe79:e416", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "on", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "off", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "on", "tx_vlan_offload": "on", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "on [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "on", "tx_gre_csum_segmentation": "on", "tx_ipxip4_segmentation": "on", "tx_ipxip6_segmentation": "on", "tx_udp_tnl_segmentation": "on", 
"tx_udp_tnl_csum_segmentation": "on", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "on", "rx_vlan_stag_hw_parse": "on", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.12.1", "interface": "eth0", "address": "10.31.14.47", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22", "macaddress": "0a:ff:dd:89:9b:e5", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.14.47", "198.51.100.3"], "ansible_all_ipv6_addresses": ["fe80::8ff:ddff:fe89:9be5", "fe80::f470:7dff:fece:1e9f", "2001:db8::2", "fe80::945a:e0ff:fe79:e416"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.14.47", "127.0.0.0/8", "127.0.0.1", "198.51.100.3"], "ipv6": ["::1", "2001:db8::2", "fe80::8ff:ddff:fe89:9be5", "fe80::945a:e0ff:fe79:e416", "fe80::f470:7dff:fece:1e9f"]}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_local": {}, "gather_subset": 
["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 44109 1727204247.22374: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.47 closed. <<< 44109 1727204247.22401: stderr chunk (state=3): >>><<< 44109 1727204247.22404: stdout chunk (state=3): >>><<< 44109 1727204247.22441: _low_level_execute_command() done: rc=0, stdout= (same ansible_facts JSON as the stdout chunks above), stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.47 closed.
44109 1727204247.22722: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204246.3968062-46219-249793691415007/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 44109 1727204247.22740: _low_level_execute_command(): starting 44109 1727204247.22748: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204246.3968062-46219-249793691415007/ > /dev/null 2>&1 && sleep 0' 44109 1727204247.23189: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44109 1727204247.23193: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44109 1727204247.23195: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44109 1727204247.23197: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44109 1727204247.23252: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 44109 1727204247.23255: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44109 1727204247.23258: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44109 1727204247.23338: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44109 1727204247.25296: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44109 1727204247.25323: stderr chunk (state=3): >>><<< 44109 1727204247.25326: stdout chunk (state=3): >>><<< 44109 1727204247.25341: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: 
Received exit status from master 0 44109 1727204247.25353: handler run complete 44109 1727204247.25442: variable 'ansible_facts' from source: unknown 44109 1727204247.25521: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44109 1727204247.25732: variable 'ansible_facts' from source: unknown 44109 1727204247.25809: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44109 1727204247.25901: attempt loop complete, returning result 44109 1727204247.25905: _execute() done 44109 1727204247.25907: dumping result to json 44109 1727204247.25929: done dumping result, returning 44109 1727204247.25936: done running TaskExecutor() for managed-node1/TASK: Gathering Facts [028d2410-947f-ed67-a560-0000000004b1] 44109 1727204247.25939: sending task result for task 028d2410-947f-ed67-a560-0000000004b1 ok: [managed-node1] 44109 1727204247.26542: no more pending results, returning what we have 44109 1727204247.26544: results queue empty 44109 1727204247.26545: checking for any_errors_fatal 44109 1727204247.26546: done checking for any_errors_fatal 44109 1727204247.26546: checking for max_fail_percentage 44109 1727204247.26547: done checking for max_fail_percentage 44109 1727204247.26548: checking to see if all hosts have failed and the running result is not ok 44109 1727204247.26548: done checking to see if all hosts have failed 44109 1727204247.26549: getting the remaining hosts for this loop 44109 1727204247.26550: done getting the remaining hosts for this loop 44109 1727204247.26552: getting the next task for host managed-node1 44109 1727204247.26555: done getting next task for host managed-node1 44109 1727204247.26557: ^ task is: TASK: meta (flush_handlers) 44109 1727204247.26558: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44109 1727204247.26561: getting variables 44109 1727204247.26562: in VariableManager get_vars() 44109 1727204247.26585: Calling all_inventory to load vars for managed-node1 44109 1727204247.26586: Calling groups_inventory to load vars for managed-node1 44109 1727204247.26588: Calling all_plugins_inventory to load vars for managed-node1 44109 1727204247.26597: Calling all_plugins_play to load vars for managed-node1 44109 1727204247.26598: Calling groups_plugins_inventory to load vars for managed-node1 44109 1727204247.26602: Calling groups_plugins_play to load vars for managed-node1 44109 1727204247.27121: done sending task result for task 028d2410-947f-ed67-a560-0000000004b1 44109 1727204247.27125: WORKER PROCESS EXITING 44109 1727204247.27367: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44109 1727204247.28329: done with get_vars() 44109 1727204247.28347: done getting variables 44109 1727204247.28400: in VariableManager get_vars() 44109 1727204247.28408: Calling all_inventory to load vars for managed-node1 44109 1727204247.28410: Calling groups_inventory to load vars for managed-node1 44109 1727204247.28413: Calling all_plugins_inventory to load vars for managed-node1 44109 1727204247.28417: Calling all_plugins_play to load vars for managed-node1 44109 1727204247.28418: Calling groups_plugins_inventory to load vars for managed-node1 44109 1727204247.28420: Calling groups_plugins_play to load vars for managed-node1 44109 1727204247.29061: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44109 1727204247.29938: done with get_vars() 44109 1727204247.29960: done queuing things up, now waiting for results queue to drain 44109 1727204247.29963: results queue empty 44109 1727204247.29963: checking for any_errors_fatal 44109 1727204247.29966: done 
checking for any_errors_fatal 44109 1727204247.29967: checking for max_fail_percentage 44109 1727204247.29968: done checking for max_fail_percentage 44109 1727204247.29973: checking to see if all hosts have failed and the running result is not ok 44109 1727204247.29973: done checking to see if all hosts have failed 44109 1727204247.29974: getting the remaining hosts for this loop 44109 1727204247.29974: done getting the remaining hosts for this loop 44109 1727204247.29978: getting the next task for host managed-node1 44109 1727204247.29981: done getting next task for host managed-node1 44109 1727204247.29983: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 44109 1727204247.29984: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44109 1727204247.29992: getting variables 44109 1727204247.29992: in VariableManager get_vars() 44109 1727204247.30002: Calling all_inventory to load vars for managed-node1 44109 1727204247.30004: Calling groups_inventory to load vars for managed-node1 44109 1727204247.30005: Calling all_plugins_inventory to load vars for managed-node1 44109 1727204247.30008: Calling all_plugins_play to load vars for managed-node1 44109 1727204247.30010: Calling groups_plugins_inventory to load vars for managed-node1 44109 1727204247.30013: Calling groups_plugins_play to load vars for managed-node1 44109 1727204247.30698: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44109 1727204247.31561: done with get_vars() 44109 1727204247.31581: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4 Tuesday 24 September 2024 14:57:27 -0400 (0:00:00.960) 0:00:24.112 ***** 44109 1727204247.31641: entering _queue_task() for managed-node1/include_tasks 44109 1727204247.31918: worker is 1 (out of 1 available) 44109 1727204247.31933: exiting _queue_task() for managed-node1/include_tasks 44109 1727204247.31943: done queuing things up, now waiting for results queue to drain 44109 1727204247.31944: waiting for pending results... 
44109 1727204247.32130: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 44109 1727204247.32201: in run() - task 028d2410-947f-ed67-a560-000000000071 44109 1727204247.32217: variable 'ansible_search_path' from source: unknown 44109 1727204247.32221: variable 'ansible_search_path' from source: unknown 44109 1727204247.32247: calling self._execute() 44109 1727204247.32324: variable 'ansible_host' from source: host vars for 'managed-node1' 44109 1727204247.32329: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44109 1727204247.32336: variable 'omit' from source: magic vars 44109 1727204247.32619: variable 'ansible_distribution_major_version' from source: facts 44109 1727204247.32628: Evaluated conditional (ansible_distribution_major_version != '6'): True 44109 1727204247.32634: _execute() done 44109 1727204247.32637: dumping result to json 44109 1727204247.32639: done dumping result, returning 44109 1727204247.32646: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [028d2410-947f-ed67-a560-000000000071] 44109 1727204247.32651: sending task result for task 028d2410-947f-ed67-a560-000000000071 44109 1727204247.32742: done sending task result for task 028d2410-947f-ed67-a560-000000000071 44109 1727204247.32745: WORKER PROCESS EXITING 44109 1727204247.32786: no more pending results, returning what we have 44109 1727204247.32792: in VariableManager get_vars() 44109 1727204247.32835: Calling all_inventory to load vars for managed-node1 44109 1727204247.32839: Calling groups_inventory to load vars for managed-node1 44109 1727204247.32841: Calling all_plugins_inventory to load vars for managed-node1 44109 1727204247.32852: Calling all_plugins_play to load vars for managed-node1 44109 1727204247.32862: Calling groups_plugins_inventory to load vars for managed-node1 44109 1727204247.32865: Calling 
groups_plugins_play to load vars for managed-node1 44109 1727204247.33685: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44109 1727204247.34563: done with get_vars() 44109 1727204247.34580: variable 'ansible_search_path' from source: unknown 44109 1727204247.34581: variable 'ansible_search_path' from source: unknown 44109 1727204247.34604: we have included files to process 44109 1727204247.34605: generating all_blocks data 44109 1727204247.34606: done generating all_blocks data 44109 1727204247.34607: processing included file: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 44109 1727204247.34607: loading included file: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 44109 1727204247.34609: Loading data from /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 44109 1727204247.34992: done processing included file 44109 1727204247.34994: iterating over new_blocks loaded from include file 44109 1727204247.34995: in VariableManager get_vars() 44109 1727204247.35009: done with get_vars() 44109 1727204247.35010: filtering new block on tags 44109 1727204247.35023: done filtering new block on tags 44109 1727204247.35026: in VariableManager get_vars() 44109 1727204247.35038: done with get_vars() 44109 1727204247.35039: filtering new block on tags 44109 1727204247.35051: done filtering new block on tags 44109 1727204247.35053: in VariableManager get_vars() 44109 1727204247.35064: done with get_vars() 44109 1727204247.35065: filtering new block on tags 44109 1727204247.35074: done filtering new block on tags 44109 1727204247.35077: done iterating over new_blocks loaded from include file included: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml for managed-node1 44109 1727204247.35081: extending task lists for 
all hosts with included blocks 44109 1727204247.35291: done extending task lists 44109 1727204247.35292: done processing included files 44109 1727204247.35293: results queue empty 44109 1727204247.35293: checking for any_errors_fatal 44109 1727204247.35294: done checking for any_errors_fatal 44109 1727204247.35295: checking for max_fail_percentage 44109 1727204247.35295: done checking for max_fail_percentage 44109 1727204247.35296: checking to see if all hosts have failed and the running result is not ok 44109 1727204247.35296: done checking to see if all hosts have failed 44109 1727204247.35297: getting the remaining hosts for this loop 44109 1727204247.35298: done getting the remaining hosts for this loop 44109 1727204247.35299: getting the next task for host managed-node1 44109 1727204247.35302: done getting next task for host managed-node1 44109 1727204247.35304: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 44109 1727204247.35306: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44109 1727204247.35314: getting variables 44109 1727204247.35315: in VariableManager get_vars() 44109 1727204247.35324: Calling all_inventory to load vars for managed-node1 44109 1727204247.35326: Calling groups_inventory to load vars for managed-node1 44109 1727204247.35327: Calling all_plugins_inventory to load vars for managed-node1 44109 1727204247.35330: Calling all_plugins_play to load vars for managed-node1 44109 1727204247.35332: Calling groups_plugins_inventory to load vars for managed-node1 44109 1727204247.35334: Calling groups_plugins_play to load vars for managed-node1 44109 1727204247.39424: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44109 1727204247.40300: done with get_vars() 44109 1727204247.40323: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role are present] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:3 Tuesday 24 September 2024 14:57:27 -0400 (0:00:00.087) 0:00:24.200 ***** 44109 1727204247.40382: entering _queue_task() for managed-node1/setup 44109 1727204247.40660: worker is 1 (out of 1 available) 44109 1727204247.40672: exiting _queue_task() for managed-node1/setup 44109 1727204247.40686: done queuing things up, now waiting for results queue to drain 44109 1727204247.40687: waiting for pending results... 
44109 1727204247.40866: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 44109 1727204247.40960: in run() - task 028d2410-947f-ed67-a560-0000000004f2 44109 1727204247.40972: variable 'ansible_search_path' from source: unknown 44109 1727204247.40977: variable 'ansible_search_path' from source: unknown 44109 1727204247.41007: calling self._execute() 44109 1727204247.41088: variable 'ansible_host' from source: host vars for 'managed-node1' 44109 1727204247.41092: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44109 1727204247.41100: variable 'omit' from source: magic vars 44109 1727204247.41391: variable 'ansible_distribution_major_version' from source: facts 44109 1727204247.41399: Evaluated conditional (ansible_distribution_major_version != '6'): True 44109 1727204247.41546: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 44109 1727204247.43007: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 44109 1727204247.43060: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 44109 1727204247.43095: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 44109 1727204247.43119: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 44109 1727204247.43141: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 44109 1727204247.43205: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44109 1727204247.43225: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' 
from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44109 1727204247.43243: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44109 1727204247.43269: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44109 1727204247.43283: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44109 1727204247.43325: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44109 1727204247.43341: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44109 1727204247.43358: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44109 1727204247.43385: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44109 1727204247.43395: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44109 1727204247.43503: variable '__network_required_facts' from source: role 
'' defaults 44109 1727204247.43514: variable 'ansible_facts' from source: unknown 44109 1727204247.43954: Evaluated conditional (__network_required_facts | difference(ansible_facts.keys() | list) | length > 0): False 44109 1727204247.43960: when evaluation is False, skipping this task 44109 1727204247.43963: _execute() done 44109 1727204247.43965: dumping result to json 44109 1727204247.43968: done dumping result, returning 44109 1727204247.43970: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present [028d2410-947f-ed67-a560-0000000004f2] 44109 1727204247.43982: sending task result for task 028d2410-947f-ed67-a560-0000000004f2 44109 1727204247.44066: done sending task result for task 028d2410-947f-ed67-a560-0000000004f2 44109 1727204247.44068: WORKER PROCESS EXITING skipping: [managed-node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 44109 1727204247.44124: no more pending results, returning what we have 44109 1727204247.44128: results queue empty 44109 1727204247.44129: checking for any_errors_fatal 44109 1727204247.44130: done checking for any_errors_fatal 44109 1727204247.44131: checking for max_fail_percentage 44109 1727204247.44133: done checking for max_fail_percentage 44109 1727204247.44134: checking to see if all hosts have failed and the running result is not ok 44109 1727204247.44135: done checking to see if all hosts have failed 44109 1727204247.44135: getting the remaining hosts for this loop 44109 1727204247.44137: done getting the remaining hosts for this loop 44109 1727204247.44140: getting the next task for host managed-node1 44109 1727204247.44150: done getting next task for host managed-node1 44109 1727204247.44153: ^ task is: TASK: fedora.linux_system_roles.network : Check if system is ostree 44109 1727204247.44156: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, 
handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44109 1727204247.44169: getting variables 44109 1727204247.44171: in VariableManager get_vars() 44109 1727204247.44210: Calling all_inventory to load vars for managed-node1 44109 1727204247.44216: Calling groups_inventory to load vars for managed-node1 44109 1727204247.44218: Calling all_plugins_inventory to load vars for managed-node1 44109 1727204247.44227: Calling all_plugins_play to load vars for managed-node1 44109 1727204247.44229: Calling groups_plugins_inventory to load vars for managed-node1 44109 1727204247.44232: Calling groups_plugins_play to load vars for managed-node1 44109 1727204247.45049: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44109 1727204247.45948: done with get_vars() 44109 1727204247.45965: done getting variables TASK [fedora.linux_system_roles.network : Check if system is ostree] *********** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:12 Tuesday 24 September 2024 14:57:27 -0400 (0:00:00.056) 0:00:24.256 ***** 44109 1727204247.46040: entering _queue_task() for managed-node1/stat 44109 1727204247.46299: worker is 1 (out of 1 available) 44109 1727204247.46316: exiting _queue_task() for managed-node1/stat 44109 1727204247.46331: done queuing things up, now waiting for results queue to drain 44109 1727204247.46332: waiting for pending results... 
44109 1727204247.46519: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Check if system is ostree
44109 1727204247.46621: in run() - task 028d2410-947f-ed67-a560-0000000004f4
44109 1727204247.46638: variable 'ansible_search_path' from source: unknown
44109 1727204247.46642: variable 'ansible_search_path' from source: unknown
44109 1727204247.46674: calling self._execute()
44109 1727204247.46749: variable 'ansible_host' from source: host vars for 'managed-node1'
44109 1727204247.46754: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1'
44109 1727204247.46761: variable 'omit' from source: magic vars
44109 1727204247.47047: variable 'ansible_distribution_major_version' from source: facts
44109 1727204247.47057: Evaluated conditional (ansible_distribution_major_version != '6'): True
44109 1727204247.47171: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name
44109 1727204247.47364: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py
44109 1727204247.47397: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py
44109 1727204247.47680: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py
44109 1727204247.47709: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py
44109 1727204247.47774: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False)
44109 1727204247.47794: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False)
44109 1727204247.47814: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False)
44109 1727204247.47831: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False)
44109 1727204247.47898: variable '__network_is_ostree' from source: set_fact
44109 1727204247.47905: Evaluated conditional (not __network_is_ostree is defined): False
44109 1727204247.47908: when evaluation is False, skipping this task
44109 1727204247.47913: _execute() done
44109 1727204247.47916: dumping result to json
44109 1727204247.47919: done dumping result, returning
44109 1727204247.47922: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Check if system is ostree [028d2410-947f-ed67-a560-0000000004f4]
44109 1727204247.47927: sending task result for task 028d2410-947f-ed67-a560-0000000004f4
44109 1727204247.48014: done sending task result for task 028d2410-947f-ed67-a560-0000000004f4
44109 1727204247.48017: WORKER PROCESS EXITING
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "not __network_is_ostree is defined",
    "skip_reason": "Conditional result was False"
}
44109 1727204247.48064: no more pending results, returning what we have
44109 1727204247.48067: results queue empty
44109 1727204247.48068: checking for any_errors_fatal
44109 1727204247.48073: done checking for any_errors_fatal
44109 1727204247.48074: checking for max_fail_percentage
44109 1727204247.48077: done checking for max_fail_percentage
44109 1727204247.48078: checking to see if all hosts have failed and the running result is not ok
44109 1727204247.48079: done checking to see if all hosts have failed
44109 1727204247.48080: getting the remaining hosts for this loop
44109 1727204247.48081: done getting the remaining hosts for this loop
44109 1727204247.48084: getting the next task for host managed-node1
44109 1727204247.48090: done getting next task for host managed-node1
44109 1727204247.48094: ^ task is: TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree
44109 1727204247.48096: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
44109 1727204247.48109: getting variables
44109 1727204247.48111: in VariableManager get_vars()
44109 1727204247.48150: Calling all_inventory to load vars for managed-node1
44109 1727204247.48153: Calling groups_inventory to load vars for managed-node1
44109 1727204247.48155: Calling all_plugins_inventory to load vars for managed-node1
44109 1727204247.48164: Calling all_plugins_play to load vars for managed-node1
44109 1727204247.48167: Calling groups_plugins_inventory to load vars for managed-node1
44109 1727204247.48170: Calling groups_plugins_play to load vars for managed-node1
44109 1727204247.49088: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
44109 1727204247.49974: done with get_vars()
44109 1727204247.49993: done getting variables
44109 1727204247.50042: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Set flag to indicate system is ostree] ***
task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:17
Tuesday 24 September 2024 14:57:27 -0400 (0:00:00.040) 0:00:24.296 *****
44109 1727204247.50069: entering _queue_task() for managed-node1/set_fact
44109 1727204247.50336: worker is 1 (out of 1 available)
44109 1727204247.50348: exiting _queue_task() for managed-node1/set_fact
44109 1727204247.50360: done queuing things up, now waiting for results queue to drain
44109 1727204247.50361: waiting for pending results...
44109 1727204247.50545: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree
44109 1727204247.50651: in run() - task 028d2410-947f-ed67-a560-0000000004f5
44109 1727204247.50663: variable 'ansible_search_path' from source: unknown
44109 1727204247.50667: variable 'ansible_search_path' from source: unknown
44109 1727204247.50703: calling self._execute()
44109 1727204247.50770: variable 'ansible_host' from source: host vars for 'managed-node1'
44109 1727204247.50774: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1'
44109 1727204247.50785: variable 'omit' from source: magic vars
44109 1727204247.51061: variable 'ansible_distribution_major_version' from source: facts
44109 1727204247.51070: Evaluated conditional (ansible_distribution_major_version != '6'): True
44109 1727204247.51185: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name
44109 1727204247.51374: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py
44109 1727204247.51409: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py
44109 1727204247.51436: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py
44109 1727204247.51495: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py
44109 1727204247.51557: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False)
44109 1727204247.51579: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False)
44109 1727204247.51598: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False)
44109 1727204247.51617: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False)
44109 1727204247.51679: variable '__network_is_ostree' from source: set_fact
44109 1727204247.51693: Evaluated conditional (not __network_is_ostree is defined): False
44109 1727204247.51697: when evaluation is False, skipping this task
44109 1727204247.51699: _execute() done
44109 1727204247.51702: dumping result to json
44109 1727204247.51704: done dumping result, returning
44109 1727204247.51711: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree [028d2410-947f-ed67-a560-0000000004f5]
44109 1727204247.51719: sending task result for task 028d2410-947f-ed67-a560-0000000004f5
44109 1727204247.51800: done sending task result for task 028d2410-947f-ed67-a560-0000000004f5
44109 1727204247.51803: WORKER PROCESS EXITING
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "not __network_is_ostree is defined",
    "skip_reason": "Conditional result was False"
}
44109 1727204247.51847: no more pending results, returning what we have
44109 1727204247.51851: results queue empty
44109 1727204247.51852: checking for any_errors_fatal
44109 1727204247.51858: done checking for any_errors_fatal
44109 1727204247.51859: checking for max_fail_percentage
44109 1727204247.51860: done checking for max_fail_percentage
44109 1727204247.51861: checking to see if all hosts have failed and the running result is not ok
44109 1727204247.51862: done checking to see if all hosts have failed
44109 1727204247.51863: getting the remaining hosts for this loop
44109 1727204247.51864: done getting the remaining hosts for this loop
44109 1727204247.51868: getting the next task for host managed-node1
44109 1727204247.51878: done getting next task for host managed-node1
44109 1727204247.51882: ^ task is: TASK: fedora.linux_system_roles.network : Check which services are running
44109 1727204247.51884: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
44109 1727204247.51898: getting variables
44109 1727204247.51900: in VariableManager get_vars()
44109 1727204247.51936: Calling all_inventory to load vars for managed-node1
44109 1727204247.51939: Calling groups_inventory to load vars for managed-node1
44109 1727204247.51942: Calling all_plugins_inventory to load vars for managed-node1
44109 1727204247.51951: Calling all_plugins_play to load vars for managed-node1
44109 1727204247.51953: Calling groups_plugins_inventory to load vars for managed-node1
44109 1727204247.51955: Calling groups_plugins_play to load vars for managed-node1
44109 1727204247.52755: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
44109 1727204247.53741: done with get_vars()
44109 1727204247.53758: done getting variables

TASK [fedora.linux_system_roles.network : Check which services are running] ****
task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21
Tuesday 24 September 2024 14:57:27 -0400 (0:00:00.037) 0:00:24.334 *****
44109 1727204247.53831: entering _queue_task() for managed-node1/service_facts
44109 1727204247.54086: worker is 1 (out of 1 available)
44109 1727204247.54099: exiting _queue_task() for managed-node1/service_facts
44109 1727204247.54112: done queuing things up, now waiting for results queue to drain
44109 1727204247.54113: waiting for pending results...
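Every diagnostic record in this trace has the same shape: the worker PID (44109 here), a high-resolution epoch timestamp, a colon, and a message. The TASK banners print the same instant in human-readable form, so the `1727204247.53831` prefix on the `entering _queue_task()` record above corresponds to the banner's `Tuesday 24 September 2024 14:57:27 -0400`. A minimal sketch for splitting one record and decoding its timestamp; the `parse_record` helper and the hard-coded -0400 offset are our illustration, not part of Ansible:

```python
import re
from datetime import datetime, timedelta, timezone

# Verbose-mode records look like "<pid> <epoch seconds>: <message>".
RECORD = re.compile(r"^(?P<pid>\d+) (?P<ts>\d+\.\d+): (?P<msg>.*)$")

def parse_record(line, utc_offset_hours=-4):
    """Split a debug record into (pid, human-readable time, message).

    The -4 offset matches the -0400 zone shown in this run's TASK
    banners; adjust it for logs captured elsewhere.
    """
    m = RECORD.match(line)
    if m is None:
        # Playbook stdout (TASK banners, per-host results) has no prefix.
        return None
    tz = timezone(timedelta(hours=utc_offset_hours))
    when = datetime.fromtimestamp(float(m["ts"]), tz)
    return int(m["pid"]), when.strftime("%A %d %B %Y %H:%M:%S %z"), m["msg"]

print(parse_record(
    "44109 1727204247.53831: entering _queue_task() for managed-node1/service_facts"
))
```

The fractional part of the prefix is what makes the per-task timing in the banners (for example `(0:00:00.037)`) reproducible from the raw records.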
44109 1727204247.54303: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Check which services are running
44109 1727204247.54402: in run() - task 028d2410-947f-ed67-a560-0000000004f7
44109 1727204247.54415: variable 'ansible_search_path' from source: unknown
44109 1727204247.54418: variable 'ansible_search_path' from source: unknown
44109 1727204247.54449: calling self._execute()
44109 1727204247.54531: variable 'ansible_host' from source: host vars for 'managed-node1'
44109 1727204247.54535: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1'
44109 1727204247.54544: variable 'omit' from source: magic vars
44109 1727204247.54828: variable 'ansible_distribution_major_version' from source: facts
44109 1727204247.54837: Evaluated conditional (ansible_distribution_major_version != '6'): True
44109 1727204247.54843: variable 'omit' from source: magic vars
44109 1727204247.54981: variable 'omit' from source: magic vars
44109 1727204247.54986: variable 'omit' from source: magic vars
44109 1727204247.54989: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection
44109 1727204247.54993: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
44109 1727204247.54996: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell
44109 1727204247.54998: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
44109 1727204247.55009: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
44109 1727204247.55031: variable 'inventory_hostname' from source: host vars for 'managed-node1'
44109 1727204247.55034: variable 'ansible_host' from source: host vars for 'managed-node1'
44109 1727204247.55038: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1'
44109 1727204247.55110: Set connection var ansible_connection to ssh
44109 1727204247.55117: Set connection var ansible_timeout to 10
44109 1727204247.55119: Set connection var ansible_module_compression to ZIP_DEFLATED
44109 1727204247.55129: Set connection var ansible_pipelining to False
44109 1727204247.55131: Set connection var ansible_shell_executable to /bin/sh
44109 1727204247.55134: Set connection var ansible_shell_type to sh
44109 1727204247.55151: variable 'ansible_shell_executable' from source: unknown
44109 1727204247.55154: variable 'ansible_connection' from source: unknown
44109 1727204247.55157: variable 'ansible_module_compression' from source: unknown
44109 1727204247.55159: variable 'ansible_shell_type' from source: unknown
44109 1727204247.55161: variable 'ansible_shell_executable' from source: unknown
44109 1727204247.55163: variable 'ansible_host' from source: host vars for 'managed-node1'
44109 1727204247.55168: variable 'ansible_pipelining' from source: unknown
44109 1727204247.55170: variable 'ansible_timeout' from source: unknown
44109 1727204247.55174: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1'
44109 1727204247.55350: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__)
44109 1727204247.55355: variable 'omit' from source: magic vars
44109 1727204247.55357: starting attempt loop
44109 1727204247.55359: running the handler
44109 1727204247.55361: _low_level_execute_command(): starting
44109 1727204247.55364: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0'
44109 1727204247.55867: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<<
44109 1727204247.55872: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<<
44109 1727204247.55878: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
44109 1727204247.55928: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<<
44109 1727204247.55932: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<<
44109 1727204247.55942: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
44109 1727204247.56035: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
44109 1727204247.57833: stdout chunk (state=3): >>>/root <<<
44109 1727204247.57925: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
44109 1727204247.57958: stderr chunk (state=3): >>><<<
44109 1727204247.57962: stdout chunk (state=3): >>><<<
44109 1727204247.57984: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0
44109 1727204247.57997: _low_level_execute_command(): starting
44109 1727204247.58003: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204247.5798504-46238-99815106032616 `" && echo ansible-tmp-1727204247.5798504-46238-99815106032616="` echo /root/.ansible/tmp/ansible-tmp-1727204247.5798504-46238-99815106032616 `" ) && sleep 0'
44109 1727204247.58466: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<<
44109 1727204247.58471: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
44109 1727204247.58477: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address <<<
44109 1727204247.58488: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<<
44109 1727204247.58490: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
44109 1727204247.58531: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<<
44109 1727204247.58535: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<<
44109 1727204247.58625: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
44109 1727204247.60749: stdout chunk (state=3): >>>ansible-tmp-1727204247.5798504-46238-99815106032616=/root/.ansible/tmp/ansible-tmp-1727204247.5798504-46238-99815106032616 <<<
44109 1727204247.60857: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
44109 1727204247.60892: stderr chunk (state=3): >>><<<
44109 1727204247.60896: stdout chunk (state=3): >>><<<
44109 1727204247.60910: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204247.5798504-46238-99815106032616=/root/.ansible/tmp/ansible-tmp-1727204247.5798504-46238-99815106032616 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0
44109 1727204247.60951: variable 'ansible_module_compression' from source: unknown
44109 1727204247.60993: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-44109pzfqangk/ansiballz_cache/ansible.modules.service_facts-ZIP_DEFLATED
44109 1727204247.61028: variable 'ansible_facts' from source: unknown
44109 1727204247.61087: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204247.5798504-46238-99815106032616/AnsiballZ_service_facts.py
44109 1727204247.61197: Sending initial data
44109 1727204247.61208: Sent initial data (161 bytes)
44109 1727204247.61726: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<<
44109 1727204247.61730: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found <<<
44109 1727204247.61732: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
44109 1727204247.61735: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<<
44109 1727204247.61738: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
44109 1727204247.61791: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<<
44109 1727204247.61796: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<<
44109 1727204247.61879: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
44109 1727204247.63636: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<<
44109 1727204247.63738: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<<
44109 1727204247.63863: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-44109pzfqangk/tmp5chxyvos /root/.ansible/tmp/ansible-tmp-1727204247.5798504-46238-99815106032616/AnsiballZ_service_facts.py <<<
44109 1727204247.63867: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204247.5798504-46238-99815106032616/AnsiballZ_service_facts.py" <<<
44109 1727204247.63931: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-44109pzfqangk/tmp5chxyvos" to remote "/root/.ansible/tmp/ansible-tmp-1727204247.5798504-46238-99815106032616/AnsiballZ_service_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204247.5798504-46238-99815106032616/AnsiballZ_service_facts.py" <<<
44109 1727204247.64621: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
44109 1727204247.64665: stderr chunk (state=3): >>><<<
44109 1727204247.64668: stdout chunk (state=3): >>><<<
44109 1727204247.64714: done transferring module to remote
44109 1727204247.64725: _low_level_execute_command(): starting
44109 1727204247.64729: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204247.5798504-46238-99815106032616/ /root/.ansible/tmp/ansible-tmp-1727204247.5798504-46238-99815106032616/AnsiballZ_service_facts.py && sleep 0'
44109 1727204247.65178: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<<
44109 1727204247.65181: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found <<<
44109 1727204247.65184: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
44109 1727204247.65186: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<<
44109 1727204247.65188: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
44109 1727204247.65237: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK <<<
44109 1727204247.65242: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
44109 1727204247.65324: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
44109 1727204247.67356: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
44109 1727204247.67359: stdout chunk (state=3): >>><<<
44109 1727204247.67362: stderr chunk (state=3): >>><<<
44109 1727204247.67464: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0
44109 1727204247.67468: _low_level_execute_command(): starting
44109 1727204247.67472: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204247.5798504-46238-99815106032616/AnsiballZ_service_facts.py && sleep 0'
44109 1727204247.68055: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<<
44109 1727204247.68092: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
44109 1727204247.68196: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<<
44109 1727204247.68218: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK <<<
44109 1727204247.68260: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
44109 1727204247.68349: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
44109 1727204249.43241: stdout chunk (state=3): >>> {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name":
"rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", 
"source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-clear.service": {"name": "systemd-hibernate-clear.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", 
"status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-quotacheck-root.service": {"name": "systemd-quotacheck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": 
"systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"systemd-udev-load-credentials.service": {"name": "systemd-udev-load-credentials.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "st<<< 44109 1727204249.43338: stdout chunk (state=3): >>>opped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": 
"systemd"}, "capsule@.service": {"name": "capsule@.service", "state": "unknown", "status": "static", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, 
"dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fips-crypto-policy-overlay.service": {"name": "fips-crypto-policy-overlay.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "lvm-devices-import.service": {"name": "lvm-devices-import.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", 
"status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon-root.service": {"name": "quotaon-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon@.service": {"name": "quotaon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, 
"sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-bootctl@.service": {"name": "systemd-bootctl@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-creds@.service": {"name": "systemd-creds@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": 
"systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald-sync@.service": {"name": "systemd-journald-sync@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": 
"disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock@.service": {"name": "systemd-pcrlock@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck@.service": {"name": "systemd-quotacheck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, 
"systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} <<< 44109 1727204249.45232: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.47 closed. <<< 44109 1727204249.45236: stdout chunk (state=3): >>><<< 44109 1727204249.45238: stderr chunk (state=3): >>><<< 44109 1727204249.45254: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": 
"running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": 
"systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, 
"modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", 
"status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": 
{"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-clear.service": {"name": "systemd-hibernate-clear.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "stopped", "status": 
"static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": 
"systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-quotacheck-root.service": {"name": "systemd-quotacheck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": 
{"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-load-credentials.service": {"name": "systemd-udev-load-credentials.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "ypbind.service": 
{"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "capsule@.service": {"name": "capsule@.service", "state": "unknown", "status": "static", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": 
"debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fips-crypto-policy-overlay.service": {"name": "fips-crypto-policy-overlay.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "lvm-devices-import.service": {"name": "lvm-devices-import.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, 
"man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon-root.service": {"name": "quotaon-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon@.service": {"name": "quotaon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, 
"serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-bootctl@.service": {"name": "systemd-bootctl@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, 
"systemd-creds@.service": {"name": "systemd-creds@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald-sync@.service": {"name": "systemd-journald-sync@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", 
"state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock@.service": {"name": "systemd-pcrlock@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck@.service": {"name": "systemd-quotacheck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": 
"systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.47 closed. 
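The `service_facts` result above is a flat mapping of unit name to a dict with `name`, `state`, `status`, and `source` keys (in a playbook it would land under `ansible_facts.services`). As a minimal sketch of consuming such a payload outside Ansible — the excerpt below is a hand-picked subset of the units shown in the log, not the full result — one might filter it for running units like this:

```python
import json

# A tiny excerpt of the service_facts payload from the log above; the real
# result maps every unit name to {"name", "state", "status", "source"}.
payload = json.loads("""
{
  "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"},
  "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"},
  "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}
}
""")

def running_services(services):
    """Return the sorted names of units reported with state == "running"."""
    return sorted(name for name, info in services.items()
                  if info.get("state") == "running")

print(running_services(payload))  # -> ['NetworkManager.service', 'sshd.service']
```

The same filter expressed in a playbook would typically be a Jinja2 `selectattr('state', 'equalto', 'running')` over `ansible_facts.services`; the Python form is shown here only because the log captures the raw JSON.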
44109 1727204249.47010: done with _execute_module (service_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'service_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204247.5798504-46238-99815106032616/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 44109 1727204249.47014: _low_level_execute_command(): starting 44109 1727204249.47017: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204247.5798504-46238-99815106032616/ > /dev/null 2>&1 && sleep 0' 44109 1727204249.48302: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 44109 1727204249.48321: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 <<< 44109 1727204249.48331: stderr chunk (state=3): >>>debug2: match found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44109 1727204249.48457: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 44109 1727204249.48501: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44109 1727204249.50519: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44109 1727204249.50553: stderr chunk (state=3): >>><<< 44109 1727204249.50561: stdout chunk (state=3): >>><<< 44109 1727204249.50597: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44109 1727204249.50799: handler run complete 44109 1727204249.50996: variable 'ansible_facts' from source: unknown 44109 1727204249.51393: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44109 1727204249.52402: variable 'ansible_facts' from source: unknown 44109 1727204249.52750: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44109 1727204249.53065: attempt loop complete, returning result 44109 1727204249.53084: _execute() done 44109 1727204249.53091: dumping result to json 44109 1727204249.53156: done dumping result, returning 44109 1727204249.53177: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Check which services are running [028d2410-947f-ed67-a560-0000000004f7] 44109 1727204249.53188: sending task result for task 028d2410-947f-ed67-a560-0000000004f7 44109 1727204249.54383: done sending task result for task 028d2410-947f-ed67-a560-0000000004f7 44109 1727204249.54386: WORKER PROCESS EXITING ok: [managed-node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 44109 1727204249.54459: no more pending results, returning what we have 44109 1727204249.54461: results queue empty 44109 1727204249.54462: checking for any_errors_fatal 44109 1727204249.54466: done checking for any_errors_fatal 44109 1727204249.54467: checking for max_fail_percentage 44109 1727204249.54469: done checking for max_fail_percentage 44109 1727204249.54469: checking to see if all hosts have failed and the running result is not ok 44109 1727204249.54471: done checking to see if all hosts have failed 44109 1727204249.54471: getting the remaining hosts for this loop 44109 1727204249.54473: done getting the remaining hosts for this loop 44109 1727204249.54478: getting the next task for host managed-node1 44109 1727204249.54484: done getting next task for host managed-node1 44109 1727204249.54487: ^ task is: TASK: fedora.linux_system_roles.network : Check which packages are installed 44109 
1727204249.54490: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44109 1727204249.54506: getting variables 44109 1727204249.54508: in VariableManager get_vars() 44109 1727204249.54541: Calling all_inventory to load vars for managed-node1 44109 1727204249.54544: Calling groups_inventory to load vars for managed-node1 44109 1727204249.54547: Calling all_plugins_inventory to load vars for managed-node1 44109 1727204249.54556: Calling all_plugins_play to load vars for managed-node1 44109 1727204249.54559: Calling groups_plugins_inventory to load vars for managed-node1 44109 1727204249.54563: Calling groups_plugins_play to load vars for managed-node1 44109 1727204249.56146: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44109 1727204249.58620: done with get_vars() 44109 1727204249.58648: done getting variables TASK [fedora.linux_system_roles.network : Check which packages are installed] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 Tuesday 24 September 2024 14:57:29 -0400 (0:00:02.049) 0:00:26.383 ***** 44109 1727204249.58756: entering _queue_task() for managed-node1/package_facts 44109 1727204249.59716: worker is 1 (out of 1 available) 44109 1727204249.59727: exiting _queue_task() for managed-node1/package_facts 44109 1727204249.59736: done queuing things up, now waiting for results queue to drain 44109 
1727204249.59737: waiting for pending results... 44109 1727204249.59980: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Check which packages are installed 44109 1727204249.60399: in run() - task 028d2410-947f-ed67-a560-0000000004f8 44109 1727204249.60412: variable 'ansible_search_path' from source: unknown 44109 1727204249.60416: variable 'ansible_search_path' from source: unknown 44109 1727204249.60453: calling self._execute() 44109 1727204249.60663: variable 'ansible_host' from source: host vars for 'managed-node1' 44109 1727204249.60668: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44109 1727204249.60723: variable 'omit' from source: magic vars 44109 1727204249.61435: variable 'ansible_distribution_major_version' from source: facts 44109 1727204249.61446: Evaluated conditional (ansible_distribution_major_version != '6'): True 44109 1727204249.61452: variable 'omit' from source: magic vars 44109 1727204249.61631: variable 'omit' from source: magic vars 44109 1727204249.61665: variable 'omit' from source: magic vars 44109 1727204249.61813: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 44109 1727204249.61982: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 44109 1727204249.61986: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 44109 1727204249.61990: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44109 1727204249.62118: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44109 1727204249.62147: variable 'inventory_hostname' from source: host vars for 'managed-node1' 44109 1727204249.62150: variable 'ansible_host' from source: host vars for 'managed-node1' 44109 1727204249.62153: 
variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44109 1727204249.62329: Set connection var ansible_connection to ssh 44109 1727204249.62343: Set connection var ansible_timeout to 10 44109 1727204249.62353: Set connection var ansible_module_compression to ZIP_DEFLATED 44109 1727204249.62543: Set connection var ansible_pipelining to False 44109 1727204249.62547: Set connection var ansible_shell_executable to /bin/sh 44109 1727204249.62549: Set connection var ansible_shell_type to sh 44109 1727204249.62552: variable 'ansible_shell_executable' from source: unknown 44109 1727204249.62554: variable 'ansible_connection' from source: unknown 44109 1727204249.62556: variable 'ansible_module_compression' from source: unknown 44109 1727204249.62558: variable 'ansible_shell_type' from source: unknown 44109 1727204249.62560: variable 'ansible_shell_executable' from source: unknown 44109 1727204249.62562: variable 'ansible_host' from source: host vars for 'managed-node1' 44109 1727204249.62564: variable 'ansible_pipelining' from source: unknown 44109 1727204249.62566: variable 'ansible_timeout' from source: unknown 44109 1727204249.62567: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44109 1727204249.63084: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 44109 1727204249.63090: variable 'omit' from source: magic vars 44109 1727204249.63092: starting attempt loop 44109 1727204249.63094: running the handler 44109 1727204249.63110: _low_level_execute_command(): starting 44109 1727204249.63126: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 44109 1727204249.64481: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 
44109 1727204249.64511: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44109 1727204249.64605: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44109 1727204249.64665: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK <<< 44109 1727204249.64771: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44109 1727204249.64920: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44109 1727204249.66872: stdout chunk (state=3): >>>/root <<< 44109 1727204249.67114: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44109 1727204249.67118: stdout chunk (state=3): >>><<< 44109 1727204249.67121: stderr chunk (state=3): >>><<< 44109 1727204249.67238: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44109 1727204249.67242: _low_level_execute_command(): starting 44109 1727204249.67245: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204249.6714544-46298-128146640283971 `" && echo ansible-tmp-1727204249.6714544-46298-128146640283971="` echo /root/.ansible/tmp/ansible-tmp-1727204249.6714544-46298-128146640283971 `" ) && sleep 0' 44109 1727204249.68481: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44109 1727204249.68492: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44109 1727204249.68503: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44109 1727204249.68518: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44109 1727204249.68530: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 <<< 44109 1727204249.68537: stderr chunk (state=3): >>>debug2: match not found <<< 44109 1727204249.68552: stderr chunk (state=3): >>>debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44109 1727204249.68561: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 44109 1727204249.68570: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.14.47 is address <<< 44109 1727204249.68578: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 44109 1727204249.68587: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44109 1727204249.68600: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44109 1727204249.68617: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44109 1727204249.68620: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 <<< 44109 1727204249.68628: stderr chunk (state=3): >>>debug2: match found <<< 44109 1727204249.68638: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44109 1727204249.68942: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 44109 1727204249.69048: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44109 1727204249.71143: stdout chunk (state=3): >>>ansible-tmp-1727204249.6714544-46298-128146640283971=/root/.ansible/tmp/ansible-tmp-1727204249.6714544-46298-128146640283971 <<< 44109 1727204249.71566: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44109 1727204249.71569: stdout chunk (state=3): >>><<< 44109 1727204249.71572: stderr chunk (state=3): >>><<< 44109 1727204249.71589: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204249.6714544-46298-128146640283971=/root/.ansible/tmp/ansible-tmp-1727204249.6714544-46298-128146640283971 , 
stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44109 1727204249.71634: variable 'ansible_module_compression' from source: unknown 44109 1727204249.71684: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-44109pzfqangk/ansiballz_cache/ansible.modules.package_facts-ZIP_DEFLATED 44109 1727204249.71763: variable 'ansible_facts' from source: unknown 44109 1727204249.72326: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204249.6714544-46298-128146640283971/AnsiballZ_package_facts.py 44109 1727204249.72451: Sending initial data 44109 1727204249.72488: Sent initial data (162 bytes) 44109 1727204249.73883: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 44109 1727204249.74055: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44109 1727204249.74059: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44109 1727204249.74143: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44109 1727204249.75889: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 44109 1727204249.75921: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 44109 
1727204249.75992: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 44109 1727204249.76084: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-44109pzfqangk/tmpakm2_jdu /root/.ansible/tmp/ansible-tmp-1727204249.6714544-46298-128146640283971/AnsiballZ_package_facts.py <<< 44109 1727204249.76161: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204249.6714544-46298-128146640283971/AnsiballZ_package_facts.py" <<< 44109 1727204249.76190: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-44109pzfqangk/tmpakm2_jdu" to remote "/root/.ansible/tmp/ansible-tmp-1727204249.6714544-46298-128146640283971/AnsiballZ_package_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204249.6714544-46298-128146640283971/AnsiballZ_package_facts.py" <<< 44109 1727204249.79453: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44109 1727204249.79457: stderr chunk (state=3): >>><<< 44109 1727204249.79460: stdout chunk (state=3): >>><<< 44109 1727204249.79462: done transferring module to remote 44109 1727204249.79464: _low_level_execute_command(): starting 44109 1727204249.79467: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204249.6714544-46298-128146640283971/ /root/.ansible/tmp/ansible-tmp-1727204249.6714544-46298-128146640283971/AnsiballZ_package_facts.py && sleep 0' 44109 1727204249.80546: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 44109 1727204249.80560: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44109 1727204249.80825: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 44109 1727204249.80837: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 44109 1727204249.80994: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44109 1727204249.82942: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44109 1727204249.82957: stderr chunk (state=3): >>><<< 44109 1727204249.82972: stdout chunk (state=3): >>><<< 44109 1727204249.83078: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44109 1727204249.83082: _low_level_execute_command(): starting 44109 1727204249.83085: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204249.6714544-46298-128146640283971/AnsiballZ_package_facts.py && sleep 0' 44109 1727204249.84325: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 44109 1727204249.84329: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44109 1727204249.84331: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address <<< 44109 1727204249.84334: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44109 1727204249.84435: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 44109 1727204249.84500: stderr chunk (state=3): >>>debug2: fd 3 setting 
O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 44109 1727204249.84609: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44109 1727204250.31755: stdout chunk (state=3): >>> {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-mono-vf-fonts": [{"name": "redhat-mono-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-text-vf-fonts": [{"name": "redhat-text-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", 
"version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240910", "release": "7.el10", "epoch": null, 
"arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.379", "release": "10.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": 
[{"name": "centos-gpg-keys", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "21.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.6", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": 
"rpm"}], "libblkid": [{"name": "libblkid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.5", "rele<<< 44109 1727204250.31789: stdout chunk (state=3): >>>ase": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.30", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0", "release": "9.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.50", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.46.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "51.20230828cvs.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "9.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "4.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.10.0", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.5.0", "release": "1.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.0", "release": "3.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.4", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": 
"openssl-libs", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "101.2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "8.el10", "epoch": null,<<< 44109 1727204250.31800: stdout chunk (state=3): >>> "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.11.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.7", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": 
"libbasicobjects", "version": "0.1.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.3", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.6.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", 
"release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "42.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.6.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.29", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.79.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsecret": [{"name": "libsecret", "version": "0.21.2", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"pinentry": [{"name": "pinentry", "version": "1.3.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.7.0", "release": "3.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.7", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.10", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.8", "release": "7.el10",<<< 44109 1727204250.31839: stdout chunk (state=3): >>> "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "35", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "6.11.0", "release": "25.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "661", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": 
[{"name": "iproute", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "memstrack": [{"name": "memstrack", "version": "0.2.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.25.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "22.el10.gating_test1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse3-libs": [{"name": "fuse3-libs", "version": "3.16.2", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": 
"28.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.62.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-gssapi": [{"name": "cyrus-sasl-gssapi", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": 
"libssh", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.7.0", 
"release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11^20190603git9e74f2d", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "29.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.8p1", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "<<< 44109 1727204250.31856: stdout chunk (state=3): >>>x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240905", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"logrotate": [{"name": "logrotate", "version": "3.22.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "makedumpfile": [{"name": "makedumpfile", "version": "1.7.5", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "76.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": 
[{"name": "os-prober", "version": "1.81", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.11.0", "release": "25.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.11.0", "release": "25.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.11.0", "release": "25.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.32", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-krb5-common": [{"name": "sssd-krb5-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": 
"rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "15.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "14.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.29", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kdump-utils": [{"name": "kdump-utils", "version": "1.0.43", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.13.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [<<< 44109 1727204250.31893: stdout chunk (state=3): >>>{"name": "pcsc-lite-ccid", "version": "1.6.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": 
"2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.3", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.24", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "1.2304.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2408.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": 
"6.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "6.11.0", "release": "25.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.11.0", "release": "25.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "<<< 44109 1727204250.31909: stdout chunk (state=3): >>>3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.8p1", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.8p1", 
"release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "8.el10", "epoch": 18, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.2.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.20", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": 
"13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwlwifi-dvm-firmware": [{"name": "iwlwifi-dvm-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwlwifi-mvm-firmware": [{"name": "iwlwifi-mvm-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "37.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.89", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": 
"2.189", "release": "511.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.27", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.12", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "510.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "510.el10", "epoch": 0, 
"arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.20", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.18", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": 
[{"name": "perl-Text-ParseWords", "version": "3.31", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.29", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.55", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "510.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.38", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "510.el10", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "510.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "511.el10", "epoch": 1, 
"arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.58", "release": "2.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.91", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.56", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.17", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.44", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.45.2", "release": 
"3.el10", "epoch": null, "arch": "noarch",<<< 44109 1727204250.31934: stdout chunk (state=3): >>> "source": "rpm"}], "git": [{"name": "git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.39", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.2.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "24.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "4.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.10", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.24.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": 
"libverto-libev", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": 
"python3-idna", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.el10", "epoch<<< 44109 1727204250.31946: stdout chunk (state=3): >>>": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0", "release": 
"9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "18.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.32.3", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.6", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "17.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "20.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": 
"x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.11", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cl<<< 44109 1727204250.31952: stdout chunk (state=3): >>>oud-utils-growpart", "version": "0.33", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip": [{"name": "python3-pip", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "11.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} <<< 44109 1727204250.34212: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.47 closed. 
<<< 44109 1727204250.34217: stderr chunk (state=3): >>><<< 44109 1727204250.34219: stdout chunk (state=3): >>><<< 44109 1727204250.34288: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-mono-vf-fonts": [{"name": "redhat-mono-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-text-vf-fonts": [{"name": "redhat-text-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.1", "release": "2.el10", "epoch": null, 
"arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": 
[{"name": "linux-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.379", "release": "10.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "10.0", "release": 
"0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "21.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.6", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": 
"2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.5", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.30", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.50", "release": "1.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.46.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "51.20230828cvs.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "9.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": 
"3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.10.0", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.5.0", "release": "1.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.0", "release": "3.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.4", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": 
[{"name": "coreutils", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "101.2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.11.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.7", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", 
"release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.3", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.6.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "11.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "42.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.6.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.29", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.79.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsecret": [{"name": "libsecret", "version": "0.21.2", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pinentry": [{"name": "pinentry", "version": "1.3.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", 
"release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.7.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": 
"ethtool", "version": "6.7", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.10", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.8", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "35", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "6.11.0", "release": "25.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "661", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "memstrack": [{"name": "memstrack", "version": "0.2.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"c-ares": [{"name": "c-ares", "version": "1.25.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "22.el10.gating_test1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse3-libs": [{"name": "fuse3-libs", "version": "3.16.2", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "28.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": 
"iptables-libs", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.62.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-gssapi": [{"name": "cyrus-sasl-gssapi", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": 
[{"name": "authselect-libs", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": 
"1.11^20190603git9e74f2d", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "29.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.8p1", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240905", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.22.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "makedumpfile": [{"name": "makedumpfile", "version": "1.7.5", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": 
"rpm-build-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "76.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", 
"version": "6.11.0", "release": "25.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.11.0", "release": "25.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.11.0", "release": "25.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.32", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-krb5-common": [{"name": "sssd-krb5-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "noarch", "source": 
"rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "15.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "14.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": 
[{"name": "python3-systemd", "version": "235", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.29", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kdump-utils": [{"name": "kdump-utils", "version": "1.0.43", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.13.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.6.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"python3-dnf": [{"name": "python3-dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.3", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.24", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "1.2304.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2408.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "6.11.0", "release": "25.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.11.0", "release": "25.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.8p1", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.8p1", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": 
"8.el10", "epoch": 18, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.2.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.20", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwlwifi-dvm-firmware": [{"name": "iwlwifi-dvm-firmware", "version": "20240910", 
"release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwlwifi-mvm-firmware": [{"name": "iwlwifi-mvm-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "37.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.89", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.189", "release": "511.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.27", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": 
"5.74", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.12", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "510.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.20", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "510.el10", 
"epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.18", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.29", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", 
"version": "1.55", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "510.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.38", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "510.el10", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "510.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.58", "release": "2.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "510.el10", "epoch": null, "arch": "noarch", 
"source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.91", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.56", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.17", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.44", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.39", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"python3-lxml": [{"name": "python3-lxml", "version": "5.2.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "24.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "4.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.10", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.24.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": 
"rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": 
"python3-jsonschema", "version": "4.19.1", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": 
"python3-policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "18.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.32.3", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.6", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "17.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "20.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.11", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", 
"version": "3.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip": [{"name": "python3-pip", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "11.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared 
connection to 10.31.14.47 closed. 44109 1727204250.36640: done with _execute_module (package_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'package_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204249.6714544-46298-128146640283971/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 44109 1727204250.36657: _low_level_execute_command(): starting 44109 1727204250.36660: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204249.6714544-46298-128146640283971/ > /dev/null 2>&1 && sleep 0' 44109 1727204250.37144: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44109 1727204250.37148: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44109 1727204250.37151: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44109 1727204250.37153: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 
'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44109 1727204250.37204: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 44109 1727204250.37207: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44109 1727204250.37209: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44109 1727204250.37303: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44109 1727204250.39336: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44109 1727204250.39339: stdout chunk (state=3): >>><<< 44109 1727204250.39368: stderr chunk (state=3): >>><<< 44109 1727204250.39371: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session 
id: 2 debug2: Received exit status from master 0 44109 1727204250.39373: handler run complete 44109 1727204250.39842: variable 'ansible_facts' from source: unknown 44109 1727204250.40110: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44109 1727204250.41147: variable 'ansible_facts' from source: unknown 44109 1727204250.41602: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44109 1727204250.42146: attempt loop complete, returning result 44109 1727204250.42156: _execute() done 44109 1727204250.42158: dumping result to json 44109 1727204250.42355: done dumping result, returning 44109 1727204250.42365: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Check which packages are installed [028d2410-947f-ed67-a560-0000000004f8] 44109 1727204250.42368: sending task result for task 028d2410-947f-ed67-a560-0000000004f8 44109 1727204250.43835: done sending task result for task 028d2410-947f-ed67-a560-0000000004f8 44109 1727204250.43839: WORKER PROCESS EXITING ok: [managed-node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 44109 1727204250.43920: no more pending results, returning what we have 44109 1727204250.43922: results queue empty 44109 1727204250.43923: checking for any_errors_fatal 44109 1727204250.43928: done checking for any_errors_fatal 44109 1727204250.43929: checking for max_fail_percentage 44109 1727204250.43930: done checking for max_fail_percentage 44109 1727204250.43931: checking to see if all hosts have failed and the running result is not ok 44109 1727204250.43932: done checking to see if all hosts have failed 44109 1727204250.43933: getting the remaining hosts for this loop 44109 1727204250.43934: done getting the remaining hosts for this loop 44109 1727204250.43936: getting the next task for host managed-node1 44109 
1727204250.43941: done getting next task for host managed-node1 44109 1727204250.43943: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider 44109 1727204250.43944: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44109 1727204250.43951: getting variables 44109 1727204250.43952: in VariableManager get_vars() 44109 1727204250.43973: Calling all_inventory to load vars for managed-node1 44109 1727204250.43977: Calling groups_inventory to load vars for managed-node1 44109 1727204250.43979: Calling all_plugins_inventory to load vars for managed-node1 44109 1727204250.43985: Calling all_plugins_play to load vars for managed-node1 44109 1727204250.43987: Calling groups_plugins_inventory to load vars for managed-node1 44109 1727204250.43989: Calling groups_plugins_play to load vars for managed-node1 44109 1727204250.44832: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44109 1727204250.46416: done with get_vars() 44109 1727204250.46440: done getting variables 44109 1727204250.46503: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print network provider] ************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Tuesday 24 September 2024 14:57:30 -0400 (0:00:00.877) 0:00:27.261 ***** 44109 1727204250.46535: entering _queue_task() for managed-node1/debug 44109 1727204250.46878: worker is 1 (out of 
1 available) 44109 1727204250.46890: exiting _queue_task() for managed-node1/debug 44109 1727204250.46902: done queuing things up, now waiting for results queue to drain 44109 1727204250.46903: waiting for pending results... 44109 1727204250.47298: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Print network provider 44109 1727204250.47309: in run() - task 028d2410-947f-ed67-a560-000000000072 44109 1727204250.47332: variable 'ansible_search_path' from source: unknown 44109 1727204250.47339: variable 'ansible_search_path' from source: unknown 44109 1727204250.47380: calling self._execute() 44109 1727204250.47490: variable 'ansible_host' from source: host vars for 'managed-node1' 44109 1727204250.47507: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44109 1727204250.47523: variable 'omit' from source: magic vars 44109 1727204250.47915: variable 'ansible_distribution_major_version' from source: facts 44109 1727204250.47935: Evaluated conditional (ansible_distribution_major_version != '6'): True 44109 1727204250.47947: variable 'omit' from source: magic vars 44109 1727204250.47990: variable 'omit' from source: magic vars 44109 1727204250.48094: variable 'network_provider' from source: set_fact 44109 1727204250.48118: variable 'omit' from source: magic vars 44109 1727204250.48167: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 44109 1727204250.48209: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 44109 1727204250.48237: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 44109 1727204250.48266: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44109 1727204250.48287: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py 
(found_in_cache=True, class_only=False) 44109 1727204250.48328: variable 'inventory_hostname' from source: host vars for 'managed-node1' 44109 1727204250.48339: variable 'ansible_host' from source: host vars for 'managed-node1' 44109 1727204250.48350: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44109 1727204250.48460: Set connection var ansible_connection to ssh 44109 1727204250.48470: Set connection var ansible_timeout to 10 44109 1727204250.48487: Set connection var ansible_module_compression to ZIP_DEFLATED 44109 1727204250.48589: Set connection var ansible_pipelining to False 44109 1727204250.48592: Set connection var ansible_shell_executable to /bin/sh 44109 1727204250.48594: Set connection var ansible_shell_type to sh 44109 1727204250.48597: variable 'ansible_shell_executable' from source: unknown 44109 1727204250.48599: variable 'ansible_connection' from source: unknown 44109 1727204250.48601: variable 'ansible_module_compression' from source: unknown 44109 1727204250.48603: variable 'ansible_shell_type' from source: unknown 44109 1727204250.48605: variable 'ansible_shell_executable' from source: unknown 44109 1727204250.48607: variable 'ansible_host' from source: host vars for 'managed-node1' 44109 1727204250.48609: variable 'ansible_pipelining' from source: unknown 44109 1727204250.48611: variable 'ansible_timeout' from source: unknown 44109 1727204250.48618: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44109 1727204250.48733: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 44109 1727204250.48749: variable 'omit' from source: magic vars 44109 1727204250.48757: starting attempt loop 44109 1727204250.48763: running the handler 44109 
1727204250.48818: handler run complete 44109 1727204250.48837: attempt loop complete, returning result 44109 1727204250.48846: _execute() done 44109 1727204250.48853: dumping result to json 44109 1727204250.48860: done dumping result, returning 44109 1727204250.48871: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Print network provider [028d2410-947f-ed67-a560-000000000072] 44109 1727204250.48883: sending task result for task 028d2410-947f-ed67-a560-000000000072 ok: [managed-node1] => {} MSG: Using network provider: nm 44109 1727204250.49080: no more pending results, returning what we have 44109 1727204250.49084: results queue empty 44109 1727204250.49085: checking for any_errors_fatal 44109 1727204250.49098: done checking for any_errors_fatal 44109 1727204250.49098: checking for max_fail_percentage 44109 1727204250.49100: done checking for max_fail_percentage 44109 1727204250.49101: checking to see if all hosts have failed and the running result is not ok 44109 1727204250.49102: done checking to see if all hosts have failed 44109 1727204250.49102: getting the remaining hosts for this loop 44109 1727204250.49104: done getting the remaining hosts for this loop 44109 1727204250.49108: getting the next task for host managed-node1 44109 1727204250.49116: done getting next task for host managed-node1 44109 1727204250.49119: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 44109 1727204250.49121: ^ state is: HOST STATE: block=2, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44109 1727204250.49131: getting variables 44109 1727204250.49133: in VariableManager get_vars() 44109 1727204250.49168: Calling all_inventory to load vars for managed-node1 44109 1727204250.49171: Calling groups_inventory to load vars for managed-node1 44109 1727204250.49173: Calling all_plugins_inventory to load vars for managed-node1 44109 1727204250.49280: done sending task result for task 028d2410-947f-ed67-a560-000000000072 44109 1727204250.49283: WORKER PROCESS EXITING 44109 1727204250.49293: Calling all_plugins_play to load vars for managed-node1 44109 1727204250.49296: Calling groups_plugins_inventory to load vars for managed-node1 44109 1727204250.49299: Calling groups_plugins_play to load vars for managed-node1 44109 1727204250.50909: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44109 1727204250.53381: done with get_vars() 44109 1727204250.53415: done getting variables 44109 1727204250.53471: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Tuesday 24 September 2024 14:57:30 -0400 (0:00:00.071) 0:00:27.333 ***** 44109 1727204250.53678: entering _queue_task() for managed-node1/fail 44109 1727204250.54202: worker is 1 (out of 1 available) 44109 1727204250.54215: exiting _queue_task() for managed-node1/fail 44109 1727204250.54228: done queuing things up, now waiting for results queue to drain 44109 1727204250.54228: waiting for pending results... 
44109 1727204250.54544: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 44109 1727204250.54727: in run() - task 028d2410-947f-ed67-a560-000000000073 44109 1727204250.54756: variable 'ansible_search_path' from source: unknown 44109 1727204250.54765: variable 'ansible_search_path' from source: unknown 44109 1727204250.54849: calling self._execute() 44109 1727204250.54968: variable 'ansible_host' from source: host vars for 'managed-node1' 44109 1727204250.55040: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44109 1727204250.55044: variable 'omit' from source: magic vars 44109 1727204250.55437: variable 'ansible_distribution_major_version' from source: facts 44109 1727204250.55454: Evaluated conditional (ansible_distribution_major_version != '6'): True 44109 1727204250.55603: variable 'network_state' from source: role '' defaults 44109 1727204250.55619: Evaluated conditional (network_state != {}): False 44109 1727204250.55627: when evaluation is False, skipping this task 44109 1727204250.55636: _execute() done 44109 1727204250.55643: dumping result to json 44109 1727204250.55651: done dumping result, returning 44109 1727204250.55661: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider [028d2410-947f-ed67-a560-000000000073] 44109 1727204250.55671: sending task result for task 028d2410-947f-ed67-a560-000000000073 skipping: [managed-node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 44109 1727204250.55992: no more pending results, returning what we have 44109 1727204250.55998: results queue empty 44109 1727204250.55999: checking for any_errors_fatal 44109 1727204250.56005: done 
checking for any_errors_fatal 44109 1727204250.56006: checking for max_fail_percentage 44109 1727204250.56009: done checking for max_fail_percentage 44109 1727204250.56009: checking to see if all hosts have failed and the running result is not ok 44109 1727204250.56010: done checking to see if all hosts have failed 44109 1727204250.56011: getting the remaining hosts for this loop 44109 1727204250.56013: done getting the remaining hosts for this loop 44109 1727204250.56016: getting the next task for host managed-node1 44109 1727204250.56484: done getting next task for host managed-node1 44109 1727204250.56488: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 44109 1727204250.56491: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44109 1727204250.56504: getting variables 44109 1727204250.56506: in VariableManager get_vars() 44109 1727204250.56540: Calling all_inventory to load vars for managed-node1 44109 1727204250.56543: Calling groups_inventory to load vars for managed-node1 44109 1727204250.56545: Calling all_plugins_inventory to load vars for managed-node1 44109 1727204250.56553: Calling all_plugins_play to load vars for managed-node1 44109 1727204250.56555: Calling groups_plugins_inventory to load vars for managed-node1 44109 1727204250.56557: Calling groups_plugins_play to load vars for managed-node1 44109 1727204250.57382: done sending task result for task 028d2410-947f-ed67-a560-000000000073 44109 1727204250.57386: WORKER PROCESS EXITING 44109 1727204250.58888: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44109 1727204250.60522: done with get_vars() 44109 1727204250.60546: done getting variables 44109 1727204250.60606: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18 Tuesday 24 September 2024 14:57:30 -0400 (0:00:00.069) 0:00:27.402 ***** 44109 1727204250.60639: entering _queue_task() for managed-node1/fail 44109 1727204250.61187: worker is 1 (out of 1 available) 44109 1727204250.61196: exiting _queue_task() for managed-node1/fail 44109 1727204250.61206: done queuing things up, now waiting for results queue to drain 44109 1727204250.61207: waiting for pending results... 
44109 1727204250.61303: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 44109 1727204250.61420: in run() - task 028d2410-947f-ed67-a560-000000000074 44109 1727204250.61444: variable 'ansible_search_path' from source: unknown 44109 1727204250.61452: variable 'ansible_search_path' from source: unknown 44109 1727204250.61494: calling self._execute() 44109 1727204250.61606: variable 'ansible_host' from source: host vars for 'managed-node1' 44109 1727204250.61621: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44109 1727204250.61635: variable 'omit' from source: magic vars 44109 1727204250.62023: variable 'ansible_distribution_major_version' from source: facts 44109 1727204250.62039: Evaluated conditional (ansible_distribution_major_version != '6'): True 44109 1727204250.62161: variable 'network_state' from source: role '' defaults 44109 1727204250.62182: Evaluated conditional (network_state != {}): False 44109 1727204250.62195: when evaluation is False, skipping this task 44109 1727204250.62203: _execute() done 44109 1727204250.62211: dumping result to json 44109 1727204250.62222: done dumping result, returning 44109 1727204250.62233: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 [028d2410-947f-ed67-a560-000000000074] 44109 1727204250.62243: sending task result for task 028d2410-947f-ed67-a560-000000000074 44109 1727204250.62481: done sending task result for task 028d2410-947f-ed67-a560-000000000074 44109 1727204250.62484: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 44109 1727204250.62536: no more pending results, returning what we have 44109 
1727204250.62540: results queue empty 44109 1727204250.62541: checking for any_errors_fatal 44109 1727204250.62549: done checking for any_errors_fatal 44109 1727204250.62550: checking for max_fail_percentage 44109 1727204250.62552: done checking for max_fail_percentage 44109 1727204250.62553: checking to see if all hosts have failed and the running result is not ok 44109 1727204250.62554: done checking to see if all hosts have failed 44109 1727204250.62554: getting the remaining hosts for this loop 44109 1727204250.62556: done getting the remaining hosts for this loop 44109 1727204250.62560: getting the next task for host managed-node1 44109 1727204250.62566: done getting next task for host managed-node1 44109 1727204250.62569: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 44109 1727204250.62571: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44109 1727204250.62591: getting variables 44109 1727204250.62593: in VariableManager get_vars() 44109 1727204250.62634: Calling all_inventory to load vars for managed-node1 44109 1727204250.62637: Calling groups_inventory to load vars for managed-node1 44109 1727204250.62639: Calling all_plugins_inventory to load vars for managed-node1 44109 1727204250.62649: Calling all_plugins_play to load vars for managed-node1 44109 1727204250.62651: Calling groups_plugins_inventory to load vars for managed-node1 44109 1727204250.62654: Calling groups_plugins_play to load vars for managed-node1 44109 1727204250.64241: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44109 1727204250.67608: done with get_vars() 44109 1727204250.67647: done getting variables 44109 1727204250.67919: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25 Tuesday 24 September 2024 14:57:30 -0400 (0:00:00.073) 0:00:27.475 ***** 44109 1727204250.67950: entering _queue_task() for managed-node1/fail 44109 1727204250.68717: worker is 1 (out of 1 available) 44109 1727204250.68728: exiting _queue_task() for managed-node1/fail 44109 1727204250.68736: done queuing things up, now waiting for results queue to drain 44109 1727204250.68737: waiting for pending results... 
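The teaming-abort task queued above evaluates its guards in sequence, all visible in the log that follows: the major version (`| int > 9`), distro membership in `__network_rh_distros`, and the `selectattr`-based team-profile check. A hedged sketch of how such a task could be written; the `when` expressions are copied from the evaluations the executor prints, while the `msg` text is an assumption:

```yaml
# Sketch of roles/network/tasks/main.yml:25; conditionals match the log, msg is assumed.
- name: Abort applying teaming configuration if the system version of the
    managed host is EL10 or later
  fail:
    msg: Team interfaces are not supported on EL10 or later  # assumed wording
  when:
    - ansible_distribution_major_version | int > 9  # True in this run
    - ansible_distribution in __network_rh_distros  # True in this run
    - network_connections | selectattr("type", "defined")
      | selectattr("type", "match", "^team$") | list | length > 0
      or network_state.get("interfaces", []) | selectattr("type", "defined")
      | selectattr("type", "match", "^team$") | list | length > 0  # False, so skipped
```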
44109 1727204250.69042: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 44109 1727204250.69383: in run() - task 028d2410-947f-ed67-a560-000000000075 44109 1727204250.69490: variable 'ansible_search_path' from source: unknown 44109 1727204250.69494: variable 'ansible_search_path' from source: unknown 44109 1727204250.69497: calling self._execute() 44109 1727204250.69549: variable 'ansible_host' from source: host vars for 'managed-node1' 44109 1727204250.69606: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44109 1727204250.69623: variable 'omit' from source: magic vars 44109 1727204250.70340: variable 'ansible_distribution_major_version' from source: facts 44109 1727204250.70483: Evaluated conditional (ansible_distribution_major_version != '6'): True 44109 1727204250.70768: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 44109 1727204250.76182: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 44109 1727204250.76186: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 44109 1727204250.76582: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 44109 1727204250.76586: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 44109 1727204250.76588: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 44109 1727204250.76591: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44109 1727204250.76594: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44109 1727204250.76596: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44109 1727204250.76812: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44109 1727204250.76833: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44109 1727204250.76936: variable 'ansible_distribution_major_version' from source: facts 44109 1727204250.76955: Evaluated conditional (ansible_distribution_major_version | int > 9): True 44109 1727204250.77073: variable 'ansible_distribution' from source: facts 44109 1727204250.77287: variable '__network_rh_distros' from source: role '' defaults 44109 1727204250.77301: Evaluated conditional (ansible_distribution in __network_rh_distros): True 44109 1727204250.77549: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44109 1727204250.77828: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44109 1727204250.77858: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44109 
1727204250.78180: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44109 1727204250.78184: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44109 1727204250.78190: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44109 1727204250.78217: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44109 1727204250.78246: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44109 1727204250.78289: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44109 1727204250.78310: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44109 1727204250.78580: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44109 1727204250.78583: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, 
class_only=False) 44109 1727204250.78585: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44109 1727204250.78610: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44109 1727204250.78626: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44109 1727204250.79343: variable 'network_connections' from source: play vars 44109 1727204250.79362: variable 'profile' from source: play vars 44109 1727204250.79447: variable 'profile' from source: play vars 44109 1727204250.79458: variable 'interface' from source: set_fact 44109 1727204250.79525: variable 'interface' from source: set_fact 44109 1727204250.79695: variable 'network_state' from source: role '' defaults 44109 1727204250.79769: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 44109 1727204250.80056: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 44109 1727204250.80223: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 44109 1727204250.80260: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 44109 1727204250.80357: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 44109 1727204250.80410: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 44109 1727204250.80452: Loading 
TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 44109 1727204250.80487: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 44109 1727204250.80524: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 44109 1727204250.80553: Evaluated conditional (network_connections | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0 or network_state.get("interfaces", []) | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0): False 44109 1727204250.80561: when evaluation is False, skipping this task 44109 1727204250.80569: _execute() done 44109 1727204250.80578: dumping result to json 44109 1727204250.80586: done dumping result, returning 44109 1727204250.80606: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [028d2410-947f-ed67-a560-000000000075] 44109 1727204250.80616: sending task result for task 028d2410-947f-ed67-a560-000000000075 44109 1727204250.80895: done sending task result for task 028d2410-947f-ed67-a560-000000000075 44109 1727204250.80898: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "network_connections | selectattr(\"type\", \"defined\") | selectattr(\"type\", \"match\", \"^team$\") | list | length > 0 or network_state.get(\"interfaces\", []) | selectattr(\"type\", \"defined\") | selectattr(\"type\", \"match\", \"^team$\") | list | length > 0", "skip_reason": "Conditional result was False" } 44109 
1727204250.80948: no more pending results, returning what we have 44109 1727204250.80953: results queue empty 44109 1727204250.80954: checking for any_errors_fatal 44109 1727204250.80962: done checking for any_errors_fatal 44109 1727204250.80962: checking for max_fail_percentage 44109 1727204250.80965: done checking for max_fail_percentage 44109 1727204250.80966: checking to see if all hosts have failed and the running result is not ok 44109 1727204250.80967: done checking to see if all hosts have failed 44109 1727204250.80967: getting the remaining hosts for this loop 44109 1727204250.80969: done getting the remaining hosts for this loop 44109 1727204250.80973: getting the next task for host managed-node1 44109 1727204250.80986: done getting next task for host managed-node1 44109 1727204250.80990: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 44109 1727204250.80992: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44109 1727204250.81008: getting variables 44109 1727204250.81010: in VariableManager get_vars() 44109 1727204250.81049: Calling all_inventory to load vars for managed-node1 44109 1727204250.81052: Calling groups_inventory to load vars for managed-node1 44109 1727204250.81054: Calling all_plugins_inventory to load vars for managed-node1 44109 1727204250.81065: Calling all_plugins_play to load vars for managed-node1 44109 1727204250.81068: Calling groups_plugins_inventory to load vars for managed-node1 44109 1727204250.81071: Calling groups_plugins_play to load vars for managed-node1 44109 1727204250.83085: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44109 1727204250.86291: done with get_vars() 44109 1727204250.86328: done getting variables 44109 1727204250.86600: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36 Tuesday 24 September 2024 14:57:30 -0400 (0:00:00.186) 0:00:27.662 ***** 44109 1727204250.86632: entering _queue_task() for managed-node1/dnf 44109 1727204250.87416: worker is 1 (out of 1 available) 44109 1727204250.87427: exiting _queue_task() for managed-node1/dnf 44109 1727204250.87437: done queuing things up, now waiting for results queue to drain 44109 1727204250.87438: waiting for pending results... 
44109 1727204250.87812: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 44109 1727204250.88183: in run() - task 028d2410-947f-ed67-a560-000000000076 44109 1727204250.88188: variable 'ansible_search_path' from source: unknown 44109 1727204250.88191: variable 'ansible_search_path' from source: unknown 44109 1727204250.88284: calling self._execute() 44109 1727204250.88530: variable 'ansible_host' from source: host vars for 'managed-node1' 44109 1727204250.88535: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44109 1727204250.88537: variable 'omit' from source: magic vars 44109 1727204250.89455: variable 'ansible_distribution_major_version' from source: facts 44109 1727204250.89459: Evaluated conditional (ansible_distribution_major_version != '6'): True 44109 1727204250.89981: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 44109 1727204250.95164: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 44109 1727204250.95169: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 44109 1727204250.95385: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 44109 1727204250.95434: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 44109 1727204250.95496: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 44109 1727204250.95724: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44109 1727204250.95850: Loading 
FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44109 1727204250.95883: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44109 1727204250.95992: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44109 1727204250.96016: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44109 1727204250.96318: variable 'ansible_distribution' from source: facts 44109 1727204250.96330: variable 'ansible_distribution_major_version' from source: facts 44109 1727204250.96351: Evaluated conditional (ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7): True 44109 1727204250.96882: variable '__network_wireless_connections_defined' from source: role '' defaults 44109 1727204250.96885: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44109 1727204250.96887: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44109 1727204250.96889: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44109 1727204250.97125: Loading 
FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44109 1727204250.97145: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44109 1727204250.97189: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44109 1727204250.97345: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44109 1727204250.97377: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44109 1727204250.97428: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44109 1727204250.97448: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44109 1727204250.97580: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44109 1727204250.97610: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44109 
1727204250.97696: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
44109 1727204250.97743: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
44109 1727204250.97845: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
44109 1727204250.98240: variable 'network_connections' from source: play vars
44109 1727204250.98309: variable 'profile' from source: play vars
44109 1727204250.98440: variable 'profile' from source: play vars
44109 1727204250.98496: variable 'interface' from source: set_fact
44109 1727204250.98634: variable 'interface' from source: set_fact
44109 1727204250.98791: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name
44109 1727204250.99139: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py
44109 1727204250.99481: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py
44109 1727204250.99484: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py
44109 1727204250.99487: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py
44109 1727204250.99543: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False)
44109 1727204250.99635: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False)
44109 1727204250.99746: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False)
44109 1727204250.99779: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False)
44109 1727204250.99880: variable '__network_team_connections_defined' from source: role '' defaults
44109 1727204251.00454: variable 'network_connections' from source: play vars
44109 1727204251.00683: variable 'profile' from source: play vars
44109 1727204251.00687: variable 'profile' from source: play vars
44109 1727204251.00689: variable 'interface' from source: set_fact
44109 1727204251.00840: variable 'interface' from source: set_fact
44109 1727204251.00871: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False
44109 1727204251.00905: when evaluation is False, skipping this task
44109 1727204251.00916: _execute() done
44109 1727204251.00924: dumping result to json
44109 1727204251.00986: done dumping result, returning
44109 1727204251.00999: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [028d2410-947f-ed67-a560-000000000076]
44109 1727204251.01017: sending task result for task 028d2410-947f-ed67-a560-000000000076
44109 1727204251.01236: done sending task result for task 028d2410-947f-ed67-a560-000000000076
44109 1727204251.01240: WORKER PROCESS EXITING
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined",
    "skip_reason": "Conditional result was False"
}
44109 1727204251.01292: no more pending results, returning what we have
44109 1727204251.01297: results queue empty
44109 1727204251.01298: checking for any_errors_fatal
44109 1727204251.01304: done checking for any_errors_fatal
44109 1727204251.01305: checking for max_fail_percentage
44109 1727204251.01307: done checking for max_fail_percentage
44109 1727204251.01308: checking to see if all hosts have failed and the running result is not ok
44109 1727204251.01309: done checking to see if all hosts have failed
44109 1727204251.01309: getting the remaining hosts for this loop
44109 1727204251.01311: done getting the remaining hosts for this loop
44109 1727204251.01314: getting the next task for host managed-node1
44109 1727204251.01320: done getting next task for host managed-node1
44109 1727204251.01324: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces
44109 1727204251.01326: ^ state is: HOST STATE: block=2, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
44109 1727204251.01592: getting variables
44109 1727204251.01594: in VariableManager get_vars()
44109 1727204251.01631: Calling all_inventory to load vars for managed-node1
44109 1727204251.01634: Calling groups_inventory to load vars for managed-node1
44109 1727204251.01637: Calling all_plugins_inventory to load vars for managed-node1
44109 1727204251.01646: Calling all_plugins_play to load vars for managed-node1
44109 1727204251.01649: Calling groups_plugins_inventory to load vars for managed-node1
44109 1727204251.01651: Calling groups_plugins_play to load vars for managed-node1
44109 1727204251.04488: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
44109 1727204251.08007: done with get_vars()
44109 1727204251.08043: done getting variables
redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf
44109 1727204251.08164: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces] ***
task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48
Tuesday 24 September 2024 14:57:31 -0400 (0:00:00.217) 0:00:27.879 *****
44109 1727204251.08352: entering _queue_task() for managed-node1/yum
44109 1727204251.09048: worker is 1 (out of 1 available)
44109 1727204251.09061: exiting _queue_task() for managed-node1/yum
44109 1727204251.09071: done queuing things up, now waiting for results queue to drain
44109 1727204251.09072: waiting for pending results...
44109 1727204251.09517: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces
44109 1727204251.09689: in run() - task 028d2410-947f-ed67-a560-000000000077
44109 1727204251.09727: variable 'ansible_search_path' from source: unknown
44109 1727204251.09741: variable 'ansible_search_path' from source: unknown
44109 1727204251.09835: calling self._execute()
44109 1727204251.09961: variable 'ansible_host' from source: host vars for 'managed-node1'
44109 1727204251.09973: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1'
44109 1727204251.09992: variable 'omit' from source: magic vars
44109 1727204251.10444: variable 'ansible_distribution_major_version' from source: facts
44109 1727204251.10462: Evaluated conditional (ansible_distribution_major_version != '6'): True
44109 1727204251.10652: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name
44109 1727204251.13101: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py
44109 1727204251.13181: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py
44109 1727204251.13228: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py
44109 1727204251.13267: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py
44109 1727204251.13305: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py
44109 1727204251.13388: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
44109 1727204251.13581: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
44109 1727204251.13584: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
44109 1727204251.13587: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
44109 1727204251.13589: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
44109 1727204251.13618: variable 'ansible_distribution_major_version' from source: facts
44109 1727204251.13640: Evaluated conditional (ansible_distribution_major_version | int < 8): False
44109 1727204251.13648: when evaluation is False, skipping this task
44109 1727204251.13655: _execute() done
44109 1727204251.13662: dumping result to json
44109 1727204251.13669: done dumping result, returning
44109 1727204251.13684: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces [028d2410-947f-ed67-a560-000000000077]
44109 1727204251.13695: sending task result for task 028d2410-947f-ed67-a560-000000000077
44109 1727204251.13806: done sending task result for task 028d2410-947f-ed67-a560-000000000077
44109 1727204251.13813: WORKER PROCESS EXITING
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "ansible_distribution_major_version | int < 8",
    "skip_reason": "Conditional result was False"
}
44109 1727204251.13898: no more pending results, returning what we have
44109 1727204251.13902: results queue empty
44109 1727204251.13903: checking for any_errors_fatal
44109 1727204251.13915: done checking for any_errors_fatal
44109 1727204251.13915: checking for max_fail_percentage
44109 1727204251.13917: done checking for max_fail_percentage
44109 1727204251.13918: checking to see if all hosts have failed and the running result is not ok
44109 1727204251.13919: done checking to see if all hosts have failed
44109 1727204251.13920: getting the remaining hosts for this loop
44109 1727204251.13921: done getting the remaining hosts for this loop
44109 1727204251.13925: getting the next task for host managed-node1
44109 1727204251.14000: done getting next task for host managed-node1
44109 1727204251.14007: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces
44109 1727204251.14009: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
44109 1727204251.14066: getting variables
44109 1727204251.14068: in VariableManager get_vars()
44109 1727204251.14104: Calling all_inventory to load vars for managed-node1
44109 1727204251.14107: Calling groups_inventory to load vars for managed-node1
44109 1727204251.14109: Calling all_plugins_inventory to load vars for managed-node1
44109 1727204251.14120: Calling all_plugins_play to load vars for managed-node1
44109 1727204251.14122: Calling groups_plugins_inventory to load vars for managed-node1
44109 1727204251.14125: Calling groups_plugins_play to load vars for managed-node1
44109 1727204251.15594: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
44109 1727204251.17236: done with get_vars()
44109 1727204251.17272: done getting variables
44109 1727204251.17341: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces] ***
task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60
Tuesday 24 September 2024 14:57:31 -0400 (0:00:00.090) 0:00:27.970 *****
44109 1727204251.17373: entering _queue_task() for managed-node1/fail
44109 1727204251.17915: worker is 1 (out of 1 available)
44109 1727204251.17927: exiting _queue_task() for managed-node1/fail
44109 1727204251.17939: done queuing things up, now waiting for results queue to drain
44109 1727204251.17940: waiting for pending results...
44109 1727204251.18444: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces
44109 1727204251.18784: in run() - task 028d2410-947f-ed67-a560-000000000078
44109 1727204251.18786: variable 'ansible_search_path' from source: unknown
44109 1727204251.18790: variable 'ansible_search_path' from source: unknown
44109 1727204251.18826: calling self._execute()
44109 1727204251.18995: variable 'ansible_host' from source: host vars for 'managed-node1'
44109 1727204251.19000: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1'
44109 1727204251.19006: variable 'omit' from source: magic vars
44109 1727204251.19696: variable 'ansible_distribution_major_version' from source: facts
44109 1727204251.19714: Evaluated conditional (ansible_distribution_major_version != '6'): True
44109 1727204251.19845: variable '__network_wireless_connections_defined' from source: role '' defaults
44109 1727204251.20057: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name
44109 1727204251.22367: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py
44109 1727204251.22451: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py
44109 1727204251.22516: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py
44109 1727204251.22541: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py
44109 1727204251.22574: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py
44109 1727204251.22682: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
44109 1727204251.22780: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
44109 1727204251.22833: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
44109 1727204251.22964: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
44109 1727204251.22971: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
44109 1727204251.22974: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
44109 1727204251.23042: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
44109 1727204251.23121: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
44109 1727204251.23162: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
44109 1727204251.23184: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
44109 1727204251.23251: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
44109 1727204251.23280: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
44109 1727204251.23314: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
44109 1727204251.23697: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
44109 1727204251.23700: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
44109 1727204251.23867: variable 'network_connections' from source: play vars
44109 1727204251.23986: variable 'profile' from source: play vars
44109 1727204251.24203: variable 'profile' from source: play vars
44109 1727204251.24229: variable 'interface' from source: set_fact
44109 1727204251.24398: variable 'interface' from source: set_fact
44109 1727204251.24531: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name
44109 1727204251.24971: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py
44109 1727204251.25136: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py
44109 1727204251.25173: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py
44109 1727204251.25256: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py
44109 1727204251.25310: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False)
44109 1727204251.25347: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False)
44109 1727204251.25379: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False)
44109 1727204251.25410: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False)
44109 1727204251.25481: variable '__network_team_connections_defined' from source: role '' defaults
44109 1727204251.25773: variable 'network_connections' from source: play vars
44109 1727204251.25788: variable 'profile' from source: play vars
44109 1727204251.25860: variable 'profile' from source: play vars
44109 1727204251.25874: variable 'interface' from source: set_fact
44109 1727204251.25942: variable 'interface' from source: set_fact
44109 1727204251.25981: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False
44109 1727204251.25991: when evaluation is False, skipping this task
44109 1727204251.25998: _execute() done
44109 1727204251.26081: dumping result to json
44109 1727204251.26086: done dumping result, returning
44109 1727204251.26091: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces [028d2410-947f-ed67-a560-000000000078]
44109 1727204251.26102: sending task result for task 028d2410-947f-ed67-a560-000000000078
44109 1727204251.26383: done sending task result for task 028d2410-947f-ed67-a560-000000000078
44109 1727204251.26387: WORKER PROCESS EXITING
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined",
    "skip_reason": "Conditional result was False"
}
44109 1727204251.26441: no more pending results, returning what we have
44109 1727204251.26445: results queue empty
44109 1727204251.26446: checking for any_errors_fatal
44109 1727204251.26453: done checking for any_errors_fatal
44109 1727204251.26454: checking for max_fail_percentage
44109 1727204251.26456: done checking for max_fail_percentage
44109 1727204251.26457: checking to see if all hosts have failed and the running result is not ok
44109 1727204251.26458: done checking to see if all hosts have failed
44109 1727204251.26459: getting the remaining hosts for this loop
44109 1727204251.26460: done getting the remaining hosts for this loop
44109 1727204251.26463: getting the next task for host managed-node1
44109 1727204251.26469: done getting next task for host managed-node1
44109 1727204251.26473: ^ task is: TASK: fedora.linux_system_roles.network : Install packages
44109 1727204251.26477: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
44109 1727204251.26493: getting variables
44109 1727204251.26494: in VariableManager get_vars()
44109 1727204251.26536: Calling all_inventory to load vars for managed-node1
44109 1727204251.26539: Calling groups_inventory to load vars for managed-node1
44109 1727204251.26541: Calling all_plugins_inventory to load vars for managed-node1
44109 1727204251.26551: Calling all_plugins_play to load vars for managed-node1
44109 1727204251.26554: Calling groups_plugins_inventory to load vars for managed-node1
44109 1727204251.26557: Calling groups_plugins_play to load vars for managed-node1
44109 1727204251.28102: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
44109 1727204251.29772: done with get_vars()
44109 1727204251.29804: done getting variables
44109 1727204251.29866: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Install packages] ********************
task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73
Tuesday 24 September 2024 14:57:31 -0400 (0:00:00.125) 0:00:28.095 *****
44109 1727204251.29901: entering _queue_task() for managed-node1/package
44109 1727204251.30264: worker is 1 (out of 1 available)
44109 1727204251.30480: exiting _queue_task() for managed-node1/package
44109 1727204251.30490: done queuing things up, now waiting for results queue to drain
44109 1727204251.30491: waiting for pending results...
44109 1727204251.30578: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Install packages
44109 1727204251.30713: in run() - task 028d2410-947f-ed67-a560-000000000079
44109 1727204251.30740: variable 'ansible_search_path' from source: unknown
44109 1727204251.30748: variable 'ansible_search_path' from source: unknown
44109 1727204251.30792: calling self._execute()
44109 1727204251.30902: variable 'ansible_host' from source: host vars for 'managed-node1'
44109 1727204251.30914: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1'
44109 1727204251.30930: variable 'omit' from source: magic vars
44109 1727204251.31322: variable 'ansible_distribution_major_version' from source: facts
44109 1727204251.31377: Evaluated conditional (ansible_distribution_major_version != '6'): True
44109 1727204251.31548: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name
44109 1727204251.31837: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py
44109 1727204251.31887: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py
44109 1727204251.31932: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py
44109 1727204251.32016: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py
44109 1727204251.32243: variable 'network_packages' from source: role '' defaults
44109 1727204251.32258: variable '__network_provider_setup' from source: role '' defaults
44109 1727204251.32274: variable '__network_service_name_default_nm' from source: role '' defaults
44109 1727204251.32348: variable '__network_service_name_default_nm' from source: role '' defaults
44109 1727204251.32369: variable '__network_packages_default_nm' from source: role '' defaults
44109 1727204251.32433: variable '__network_packages_default_nm' from source: role '' defaults
44109 1727204251.32631: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name
44109 1727204251.34757: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py
44109 1727204251.34865: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py
44109 1727204251.34879: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py
44109 1727204251.34916: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py
44109 1727204251.34948: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py
44109 1727204251.35043: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
44109 1727204251.35088: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
44109 1727204251.35280: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
44109 1727204251.35284: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
44109 1727204251.35287: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
44109 1727204251.35289: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
44109 1727204251.35291: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
44109 1727204251.35293: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
44109 1727204251.35332: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
44109 1727204251.35351: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
44109 1727204251.35603: variable '__network_packages_default_gobject_packages' from source: role '' defaults
44109 1727204251.35738: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
44109 1727204251.35769: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
44109 1727204251.35801: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
44109 1727204251.35851: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
44109 1727204251.35869: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
44109 1727204251.35977: variable 'ansible_python' from source: facts
44109 1727204251.36012: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults
44109 1727204251.36110: variable '__network_wpa_supplicant_required' from source: role '' defaults
44109 1727204251.36205: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults
44109 1727204251.36385: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
44109 1727204251.36407: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
44109 1727204251.36480: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
44109 1727204251.36484: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
44109 1727204251.36508: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
44109 1727204251.36561: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
44109 1727204251.36608: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
44109 1727204251.36637: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
44109 1727204251.36819: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
44109 1727204251.36822: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
44109 1727204251.37026: variable 'network_connections' from source: play vars
44109 1727204251.37286: variable 'profile' from source: play vars
44109 1727204251.37302: variable 'profile' from source: play vars
44109 1727204251.37312: variable 'interface' from source: set_fact
44109 1727204251.37455: variable 'interface' from source: set_fact
44109 1727204251.37580: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False)
44109 1727204251.37751: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False)
44109 1727204251.37790: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False)
44109 1727204251.37885: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False)
44109 1727204251.38083: variable '__network_wireless_connections_defined' from source: role '' defaults
44109 1727204251.38668: variable 'network_connections' from source: play vars
44109 1727204251.38685: variable 'profile' from source: play vars
44109 1727204251.38881: variable 'profile' from source: play vars
44109 1727204251.38899: variable 'interface' from source: set_fact
44109 1727204251.39032: variable 'interface' from source: set_fact
44109 1727204251.39144: variable '__network_packages_default_wireless' from source: role '' defaults
44109 1727204251.39282: variable '__network_wireless_connections_defined' from source: role '' defaults
44109 1727204251.39897: variable 'network_connections' from source: play vars
44109 1727204251.39951: variable 'profile' from source: play vars
44109 1727204251.40183: variable 'profile' from source: play vars
44109 1727204251.40186: variable 'interface' from source: set_fact
44109 1727204251.40341: variable 'interface' from source: set_fact
44109 1727204251.40373: variable '__network_packages_default_team' from source: role '' defaults
44109 1727204251.40532: variable '__network_team_connections_defined' from source: role '' defaults
44109 1727204251.41245: variable 'network_connections' from source: play vars
44109 1727204251.41255: variable 'profile' from source: play vars
44109 1727204251.41389: variable 'profile' from source: play vars
44109 1727204251.41599: variable 'interface' from source: set_fact
44109 1727204251.41601: variable 'interface' from source: set_fact
44109 1727204251.41764: variable '__network_service_name_default_initscripts' from source: role '' defaults
44109 1727204251.41893: variable '__network_service_name_default_initscripts' from source: role '' defaults
44109 1727204251.41908: variable '__network_packages_default_initscripts' from source: role '' defaults
44109 1727204251.42041: variable '__network_packages_default_initscripts' from source: role '' defaults
44109 1727204251.42344: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults
44109 1727204251.43095: variable 'network_connections' from source: play vars
44109 1727204251.43109: variable 'profile' from source: play vars
44109 1727204251.43185: variable 'profile' from source: play vars
44109 1727204251.43203: variable 'interface' from source: set_fact
44109 1727204251.43284: variable 'interface' from source: set_fact
44109 1727204251.43300: variable 'ansible_distribution' from source: facts
44109 1727204251.43310: variable '__network_rh_distros' from source: role '' defaults
44109 1727204251.43322: variable 'ansible_distribution_major_version' from source: facts
44109 1727204251.43350: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults
44109 1727204251.43537: variable 'ansible_distribution' from source: facts
44109 1727204251.43548: variable '__network_rh_distros' from source: role '' defaults
44109 1727204251.43567: variable 'ansible_distribution_major_version' from source: facts
44109 1727204251.43591: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults
44109 1727204251.43764: variable 'ansible_distribution' from source: facts
44109 1727204251.43774: variable '__network_rh_distros' from source: role '' defaults
44109 1727204251.43795: variable 'ansible_distribution_major_version' from source: facts
44109 1727204251.43841: variable 'network_provider' from source: set_fact
44109 1727204251.43863: variable 'ansible_facts' from source: unknown
44109 1727204251.44506: Evaluated conditional (not network_packages is subset(ansible_facts.packages.keys())): False
44109
1727204251.44510: when evaluation is False, skipping this task 44109 1727204251.44515: _execute() done 44109 1727204251.44517: dumping result to json 44109 1727204251.44519: done dumping result, returning 44109 1727204251.44524: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Install packages [028d2410-947f-ed67-a560-000000000079] 44109 1727204251.44531: sending task result for task 028d2410-947f-ed67-a560-000000000079 44109 1727204251.44623: done sending task result for task 028d2410-947f-ed67-a560-000000000079 44109 1727204251.44625: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "not network_packages is subset(ansible_facts.packages.keys())", "skip_reason": "Conditional result was False" } 44109 1727204251.44799: no more pending results, returning what we have 44109 1727204251.44802: results queue empty 44109 1727204251.44803: checking for any_errors_fatal 44109 1727204251.44810: done checking for any_errors_fatal 44109 1727204251.44811: checking for max_fail_percentage 44109 1727204251.44815: done checking for max_fail_percentage 44109 1727204251.44815: checking to see if all hosts have failed and the running result is not ok 44109 1727204251.44816: done checking to see if all hosts have failed 44109 1727204251.44817: getting the remaining hosts for this loop 44109 1727204251.44818: done getting the remaining hosts for this loop 44109 1727204251.44821: getting the next task for host managed-node1 44109 1727204251.44827: done getting next task for host managed-node1 44109 1727204251.44830: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 44109 1727204251.44832: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False 44109 1727204251.44844: getting variables 44109 1727204251.44845: in VariableManager get_vars() 44109 1727204251.44882: Calling all_inventory to load vars for managed-node1 44109 1727204251.44884: Calling groups_inventory to load vars for managed-node1 44109 1727204251.44886: Calling all_plugins_inventory to load vars for managed-node1 44109 1727204251.44899: Calling all_plugins_play to load vars for managed-node1 44109 1727204251.44902: Calling groups_plugins_inventory to load vars for managed-node1 44109 1727204251.44904: Calling groups_plugins_play to load vars for managed-node1 44109 1727204251.46203: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44109 1727204251.47546: done with get_vars() 44109 1727204251.47566: done getting variables 44109 1727204251.47615: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85 Tuesday 24 September 2024 14:57:31 -0400 (0:00:00.177) 0:00:28.272 ***** 44109 1727204251.47637: entering _queue_task() for managed-node1/package 44109 1727204251.47894: worker is 1 (out of 1 available) 44109 1727204251.47910: exiting _queue_task() for managed-node1/package 44109 1727204251.47924: done queuing things up, now waiting for results queue to drain 44109 1727204251.47925: waiting for pending results... 
44109 1727204251.48107: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable
44109 1727204251.48180: in run() - task 028d2410-947f-ed67-a560-00000000007a
44109 1727204251.48193: variable 'ansible_search_path' from source: unknown
44109 1727204251.48196: variable 'ansible_search_path' from source: unknown
44109 1727204251.48225: calling self._execute()
44109 1727204251.48302: variable 'ansible_host' from source: host vars for 'managed-node1'
44109 1727204251.48307: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1'
44109 1727204251.48317: variable 'omit' from source: magic vars
44109 1727204251.48594: variable 'ansible_distribution_major_version' from source: facts
44109 1727204251.48600: Evaluated conditional (ansible_distribution_major_version != '6'): True
44109 1727204251.48769: variable 'network_state' from source: role '' defaults
44109 1727204251.48773: Evaluated conditional (network_state != {}): False
44109 1727204251.48778: when evaluation is False, skipping this task
44109 1727204251.48781: _execute() done
44109 1727204251.48783: dumping result to json
44109 1727204251.48785: done dumping result, returning
44109 1727204251.48787: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [028d2410-947f-ed67-a560-00000000007a]
44109 1727204251.48789: sending task result for task 028d2410-947f-ed67-a560-00000000007a
44109 1727204251.48850: done sending task result for task 028d2410-947f-ed67-a560-00000000007a
44109 1727204251.48853: WORKER PROCESS EXITING
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "network_state != {}",
    "skip_reason": "Conditional result was False"
}
44109 1727204251.48906: no more pending results, returning what we have
44109 1727204251.48910: results queue empty
44109 1727204251.48914: checking for any_errors_fatal
44109 1727204251.48921: done checking for any_errors_fatal
44109 1727204251.48922: checking for max_fail_percentage
44109 1727204251.48924: done checking for max_fail_percentage
44109 1727204251.48925: checking to see if all hosts have failed and the running result is not ok
44109 1727204251.48926: done checking to see if all hosts have failed
44109 1727204251.48927: getting the remaining hosts for this loop
44109 1727204251.48929: done getting the remaining hosts for this loop
44109 1727204251.48932: getting the next task for host managed-node1
44109 1727204251.48939: done getting next task for host managed-node1
44109 1727204251.48942: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable
44109 1727204251.48945: ^ state is: HOST STATE: block=2, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
44109 1727204251.48964: getting variables
44109 1727204251.48966: in VariableManager get_vars()
44109 1727204251.49016: Calling all_inventory to load vars for managed-node1
44109 1727204251.49019: Calling groups_inventory to load vars for managed-node1
44109 1727204251.49021: Calling all_plugins_inventory to load vars for managed-node1
44109 1727204251.49031: Calling all_plugins_play to load vars for managed-node1
44109 1727204251.49033: Calling groups_plugins_inventory to load vars for managed-node1
44109 1727204251.49036: Calling groups_plugins_play to load vars for managed-node1
44109 1727204251.50454: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
44109 1727204251.51342: done with get_vars()
44109 1727204251.51364: done getting variables
44109 1727204251.51415: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] ***
task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96
Tuesday 24 September 2024 14:57:31 -0400 (0:00:00.037) 0:00:28.310 *****
44109 1727204251.51438: entering _queue_task() for managed-node1/package
44109 1727204251.51707: worker is 1 (out of 1 available)
44109 1727204251.51725: exiting _queue_task() for managed-node1/package
44109 1727204251.51735: done queuing things up, now waiting for results queue to drain
44109 1727204251.51736: waiting for pending results...
44109 1727204251.51922: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable
44109 1727204251.51996: in run() - task 028d2410-947f-ed67-a560-00000000007b
44109 1727204251.52008: variable 'ansible_search_path' from source: unknown
44109 1727204251.52014: variable 'ansible_search_path' from source: unknown
44109 1727204251.52040: calling self._execute()
44109 1727204251.52125: variable 'ansible_host' from source: host vars for 'managed-node1'
44109 1727204251.52129: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1'
44109 1727204251.52136: variable 'omit' from source: magic vars
44109 1727204251.52419: variable 'ansible_distribution_major_version' from source: facts
44109 1727204251.52427: Evaluated conditional (ansible_distribution_major_version != '6'): True
44109 1727204251.52516: variable 'network_state' from source: role '' defaults
44109 1727204251.52523: Evaluated conditional (network_state != {}): False
44109 1727204251.52526: when evaluation is False, skipping this task
44109 1727204251.52529: _execute() done
44109 1727204251.52531: dumping result to json
44109 1727204251.52535: done dumping result, returning
44109 1727204251.52543: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [028d2410-947f-ed67-a560-00000000007b]
44109 1727204251.52547: sending task result for task 028d2410-947f-ed67-a560-00000000007b
44109 1727204251.52641: done sending task result for task 028d2410-947f-ed67-a560-00000000007b
44109 1727204251.52644: WORKER PROCESS EXITING
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "network_state != {}",
    "skip_reason": "Conditional result was False"
}
44109 1727204251.52694: no more pending results, returning what we have
44109 1727204251.52697: results queue empty
44109 1727204251.52698: checking for any_errors_fatal
44109 1727204251.52708: done checking for any_errors_fatal
44109 1727204251.52708: checking for max_fail_percentage
44109 1727204251.52710: done checking for max_fail_percentage
44109 1727204251.52713: checking to see if all hosts have failed and the running result is not ok
44109 1727204251.52714: done checking to see if all hosts have failed
44109 1727204251.52715: getting the remaining hosts for this loop
44109 1727204251.52716: done getting the remaining hosts for this loop
44109 1727204251.52719: getting the next task for host managed-node1
44109 1727204251.52725: done getting next task for host managed-node1
44109 1727204251.52728: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces
44109 1727204251.52731: ^ state is: HOST STATE: block=2, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
44109 1727204251.52747: getting variables
44109 1727204251.52748: in VariableManager get_vars()
44109 1727204251.52786: Calling all_inventory to load vars for managed-node1
44109 1727204251.52789: Calling groups_inventory to load vars for managed-node1
44109 1727204251.52791: Calling all_plugins_inventory to load vars for managed-node1
44109 1727204251.52801: Calling all_plugins_play to load vars for managed-node1
44109 1727204251.52803: Calling groups_plugins_inventory to load vars for managed-node1
44109 1727204251.52805: Calling groups_plugins_play to load vars for managed-node1
44109 1727204251.53638: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
44109 1727204251.54559: done with get_vars()
44109 1727204251.54589: done getting variables
44109 1727204251.54638: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] ***
task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109
Tuesday 24 September 2024 14:57:31 -0400 (0:00:00.032) 0:00:28.342 *****
44109 1727204251.54662: entering _queue_task() for managed-node1/service
44109 1727204251.54932: worker is 1 (out of 1 available)
44109 1727204251.54946: exiting _queue_task() for managed-node1/service
44109 1727204251.54958: done queuing things up, now waiting for results queue to drain
44109 1727204251.54959: waiting for pending results...
44109 1727204251.55162: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces
44109 1727204251.55246: in run() - task 028d2410-947f-ed67-a560-00000000007c
44109 1727204251.55259: variable 'ansible_search_path' from source: unknown
44109 1727204251.55262: variable 'ansible_search_path' from source: unknown
44109 1727204251.55302: calling self._execute()
44109 1727204251.55379: variable 'ansible_host' from source: host vars for 'managed-node1'
44109 1727204251.55384: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1'
44109 1727204251.55391: variable 'omit' from source: magic vars
44109 1727204251.55690: variable 'ansible_distribution_major_version' from source: facts
44109 1727204251.55700: Evaluated conditional (ansible_distribution_major_version != '6'): True
44109 1727204251.55789: variable '__network_wireless_connections_defined' from source: role '' defaults
44109 1727204251.55929: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name
44109 1727204251.63985: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py
44109 1727204251.63989: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py
44109 1727204251.64027: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py
44109 1727204251.64051: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py
44109 1727204251.64074: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py
44109 1727204251.64139: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
44109 1727204251.64175: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
44109 1727204251.64199: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
44109 1727204251.64242: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
44109 1727204251.64250: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
44109 1727204251.64292: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
44109 1727204251.64313: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
44109 1727204251.64340: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
44109 1727204251.64381: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
44109 1727204251.64481: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
44109 1727204251.64484: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
44109 1727204251.64487: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
44109 1727204251.64505: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
44109 1727204251.64555: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
44109 1727204251.64577: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
44109 1727204251.64781: variable 'network_connections' from source: play vars
44109 1727204251.64798: variable 'profile' from source: play vars
44109 1727204251.64906: variable 'profile' from source: play vars
44109 1727204251.64920: variable 'interface' from source: set_fact
44109 1727204251.64996: variable 'interface' from source: set_fact
44109 1727204251.65182: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name
44109 1727204251.65256: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py
44109 1727204251.65301: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py
44109 1727204251.65340: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py
44109 1727204251.65374: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py
44109 1727204251.65444: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False)
44109 1727204251.65472: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False)
44109 1727204251.65505: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False)
44109 1727204251.65541: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False)
44109 1727204251.65586: variable '__network_team_connections_defined' from source: role '' defaults
44109 1727204251.65849: variable 'network_connections' from source: play vars
44109 1727204251.65859: variable 'profile' from source: play vars
44109 1727204251.65934: variable 'profile' from source: play vars
44109 1727204251.65944: variable 'interface' from source: set_fact
44109 1727204251.66021: variable 'interface' from source: set_fact
44109 1727204251.66057: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False
44109 1727204251.66066: when evaluation is False, skipping this task
44109 1727204251.66073: _execute() done
44109 1727204251.66180: dumping result to json
44109 1727204251.66183: done dumping result, returning
44109 1727204251.66186: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces [028d2410-947f-ed67-a560-00000000007c]
44109 1727204251.66196: sending task result for task 028d2410-947f-ed67-a560-00000000007c
44109 1727204251.66265: done sending task result for task 028d2410-947f-ed67-a560-00000000007c
44109 1727204251.66269: WORKER PROCESS EXITING
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined",
    "skip_reason": "Conditional result was False"
}
44109 1727204251.66324: no more pending results, returning what we have
44109 1727204251.66327: results queue empty
44109 1727204251.66329: checking for any_errors_fatal
44109 1727204251.66335: done checking for any_errors_fatal
44109 1727204251.66336: checking for max_fail_percentage
44109 1727204251.66338: done checking for max_fail_percentage
44109 1727204251.66339: checking to see if all hosts have failed and the running result is not ok
44109 1727204251.66340: done checking to see if all hosts have failed
44109 1727204251.66340: getting the remaining hosts for this loop
44109 1727204251.66342: done getting the remaining hosts for this loop
44109 1727204251.66345: getting the next task for host managed-node1
44109 1727204251.66351: done getting next task for host managed-node1
44109 1727204251.66355: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager
44109 1727204251.66357: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
44109 1727204251.66370: getting variables
44109 1727204251.66372: in VariableManager get_vars()
44109 1727204251.66414: Calling all_inventory to load vars for managed-node1
44109 1727204251.66417: Calling groups_inventory to load vars for managed-node1
44109 1727204251.66420: Calling all_plugins_inventory to load vars for managed-node1
44109 1727204251.66430: Calling all_plugins_play to load vars for managed-node1
44109 1727204251.66433: Calling groups_plugins_inventory to load vars for managed-node1
44109 1727204251.66437: Calling groups_plugins_play to load vars for managed-node1
44109 1727204251.74085: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
44109 1727204251.75649: done with get_vars()
44109 1727204251.75686: done getting variables
44109 1727204251.75741: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] *****
task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122
Tuesday 24 September 2024 14:57:31 -0400 (0:00:00.211) 0:00:28.553 *****
44109 1727204251.75767: entering _queue_task() for managed-node1/service
44109 1727204251.76128: worker is 1 (out of 1 available)
44109 1727204251.76140: exiting _queue_task() for managed-node1/service
44109 1727204251.76150: done queuing things up, now waiting for results queue to drain
44109 1727204251.76151: waiting for pending results...
44109 1727204251.76599: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager
44109 1727204251.76604: in run() - task 028d2410-947f-ed67-a560-00000000007d
44109 1727204251.76608: variable 'ansible_search_path' from source: unknown
44109 1727204251.76610: variable 'ansible_search_path' from source: unknown
44109 1727204251.76645: calling self._execute()
44109 1727204251.76766: variable 'ansible_host' from source: host vars for 'managed-node1'
44109 1727204251.76783: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1'
44109 1727204251.76798: variable 'omit' from source: magic vars
44109 1727204251.77215: variable 'ansible_distribution_major_version' from source: facts
44109 1727204251.77349: Evaluated conditional (ansible_distribution_major_version != '6'): True
44109 1727204251.77418: variable 'network_provider' from source: set_fact
44109 1727204251.77430: variable 'network_state' from source: role '' defaults
44109 1727204251.77446: Evaluated conditional (network_provider == "nm" or network_state != {}): True
44109 1727204251.77463: variable 'omit' from source: magic vars
44109 1727204251.77511: variable 'omit' from source: magic vars
44109 1727204251.77557: variable 'network_service_name' from source: role '' defaults
44109 1727204251.77640: variable 'network_service_name' from source: role '' defaults
44109 1727204251.77760: variable '__network_provider_setup' from source: role '' defaults
44109 1727204251.77772: variable '__network_service_name_default_nm' from source: role '' defaults
44109 1727204251.77846: variable '__network_service_name_default_nm' from source: role '' defaults
44109 1727204251.77860: variable '__network_packages_default_nm' from source: role '' defaults
44109 1727204251.77934: variable '__network_packages_default_nm' from source: role '' defaults
44109 1727204251.78219: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name
44109 1727204251.80665: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py
44109 1727204251.80794: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py
44109 1727204251.80929: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py
44109 1727204251.80978: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py
44109 1727204251.81014: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py
44109 1727204251.81106: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
44109 1727204251.81162: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
44109 1727204251.81581: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
44109 1727204251.81585: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
44109 1727204251.81587: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
44109 1727204251.81589: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
44109 1727204251.81591: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
44109 1727204251.81593: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
44109 1727204251.81594: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
44109 1727204251.81597: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
44109 1727204251.82070: variable '__network_packages_default_gobject_packages' from source: role '' defaults
44109 1727204251.82397: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
44109 1727204251.82432: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
44109 1727204251.82505: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
44109 1727204251.82578: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
44109 1727204251.82601: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
44109 1727204251.82721: variable 'ansible_python' from source: facts
44109 1727204251.82761: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults
44109 1727204251.82910: variable '__network_wpa_supplicant_required' from source: role '' defaults
44109 1727204251.82968: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults
44109 1727204251.83116: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
44109 1727204251.83153: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
44109 1727204251.83187: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
44109 1727204251.83279: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
44109 1727204251.83286: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
44109 1727204251.83306: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
44109 1727204251.83345: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
44109 1727204251.83372: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
44109 1727204251.83429: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
44109 1727204251.83447: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
44109 1727204251.83615: variable 'network_connections' from source: play vars
44109 1727204251.83618: variable 'profile' from source: play vars
44109 1727204251.83681: variable 'profile' from source: play vars
44109 1727204251.83692: variable 'interface' from source: set_fact
44109 1727204251.83829: variable 'interface' from source: set_fact
44109 1727204251.83864: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name
44109 1727204251.84091: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py
44109 1727204251.84157: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py
44109 1727204251.84209: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py
44109 1727204251.84263: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py
44109 1727204251.84335: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 44109 1727204251.84367: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 44109 1727204251.84409: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 44109 1727204251.84447: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 44109 1727204251.84503: variable '__network_wireless_connections_defined' from source: role '' defaults 44109 1727204251.84865: variable 'network_connections' from source: play vars 44109 1727204251.84922: variable 'profile' from source: play vars 44109 1727204251.84958: variable 'profile' from source: play vars 44109 1727204251.84969: variable 'interface' from source: set_fact 44109 1727204251.85038: variable 'interface' from source: set_fact 44109 1727204251.85079: variable '__network_packages_default_wireless' from source: role '' defaults 44109 1727204251.85168: variable '__network_wireless_connections_defined' from source: role '' defaults 44109 1727204251.85446: variable 'network_connections' from source: play vars 44109 1727204251.85456: variable 'profile' from source: play vars 44109 1727204251.85574: variable 'profile' from source: play vars 44109 1727204251.85580: variable 'interface' from source: set_fact 44109 1727204251.85610: variable 'interface' from source: set_fact 44109 1727204251.85644: variable '__network_packages_default_team' from source: role '' defaults 44109 1727204251.85723: variable '__network_team_connections_defined' from source: role '' defaults 44109 1727204251.85989: variable 
'network_connections' from source: play vars 44109 1727204251.85998: variable 'profile' from source: play vars 44109 1727204251.86063: variable 'profile' from source: play vars 44109 1727204251.86072: variable 'interface' from source: set_fact 44109 1727204251.86148: variable 'interface' from source: set_fact 44109 1727204251.86223: variable '__network_service_name_default_initscripts' from source: role '' defaults 44109 1727204251.86265: variable '__network_service_name_default_initscripts' from source: role '' defaults 44109 1727204251.86279: variable '__network_packages_default_initscripts' from source: role '' defaults 44109 1727204251.86440: variable '__network_packages_default_initscripts' from source: role '' defaults 44109 1727204251.86561: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 44109 1727204251.87044: variable 'network_connections' from source: play vars 44109 1727204251.87054: variable 'profile' from source: play vars 44109 1727204251.87122: variable 'profile' from source: play vars 44109 1727204251.87130: variable 'interface' from source: set_fact 44109 1727204251.87203: variable 'interface' from source: set_fact 44109 1727204251.87220: variable 'ansible_distribution' from source: facts 44109 1727204251.87228: variable '__network_rh_distros' from source: role '' defaults 44109 1727204251.87236: variable 'ansible_distribution_major_version' from source: facts 44109 1727204251.87252: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 44109 1727204251.87434: variable 'ansible_distribution' from source: facts 44109 1727204251.87443: variable '__network_rh_distros' from source: role '' defaults 44109 1727204251.87451: variable 'ansible_distribution_major_version' from source: facts 44109 1727204251.87468: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 44109 1727204251.87654: variable 'ansible_distribution' from source: 
facts 44109 1727204251.87662: variable '__network_rh_distros' from source: role '' defaults 44109 1727204251.87670: variable 'ansible_distribution_major_version' from source: facts 44109 1727204251.87710: variable 'network_provider' from source: set_fact 44109 1727204251.87742: variable 'omit' from source: magic vars 44109 1727204251.87856: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 44109 1727204251.87860: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 44109 1727204251.87863: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 44109 1727204251.87865: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44109 1727204251.87867: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44109 1727204251.87982: variable 'inventory_hostname' from source: host vars for 'managed-node1' 44109 1727204251.87986: variable 'ansible_host' from source: host vars for 'managed-node1' 44109 1727204251.87988: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44109 1727204251.88023: Set connection var ansible_connection to ssh 44109 1727204251.88034: Set connection var ansible_timeout to 10 44109 1727204251.88043: Set connection var ansible_module_compression to ZIP_DEFLATED 44109 1727204251.88053: Set connection var ansible_pipelining to False 44109 1727204251.88061: Set connection var ansible_shell_executable to /bin/sh 44109 1727204251.88069: Set connection var ansible_shell_type to sh 44109 1727204251.88101: variable 'ansible_shell_executable' from source: unknown 44109 1727204251.88109: variable 'ansible_connection' from source: unknown 44109 1727204251.88118: variable 'ansible_module_compression' from source: unknown 44109 1727204251.88124: 
variable 'ansible_shell_type' from source: unknown 44109 1727204251.88130: variable 'ansible_shell_executable' from source: unknown 44109 1727204251.88136: variable 'ansible_host' from source: host vars for 'managed-node1' 44109 1727204251.88148: variable 'ansible_pipelining' from source: unknown 44109 1727204251.88154: variable 'ansible_timeout' from source: unknown 44109 1727204251.88163: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44109 1727204251.88271: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 44109 1727204251.88290: variable 'omit' from source: magic vars 44109 1727204251.88419: starting attempt loop 44109 1727204251.88423: running the handler 44109 1727204251.88425: variable 'ansible_facts' from source: unknown 44109 1727204251.89231: _low_level_execute_command(): starting 44109 1727204251.89243: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 44109 1727204251.90062: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final 
all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44109 1727204251.90094: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK <<< 44109 1727204251.90390: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44109 1727204251.90487: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44109 1727204251.92281: stdout chunk (state=3): >>>/root <<< 44109 1727204251.92415: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44109 1727204251.92428: stdout chunk (state=3): >>><<< 44109 1727204251.92441: stderr chunk (state=3): >>><<< 44109 1727204251.92682: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master 
session id: 2 debug2: Received exit status from master 0 44109 1727204251.92685: _low_level_execute_command(): starting 44109 1727204251.92688: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204251.9258404-46372-40915808129454 `" && echo ansible-tmp-1727204251.9258404-46372-40915808129454="` echo /root/.ansible/tmp/ansible-tmp-1727204251.9258404-46372-40915808129454 `" ) && sleep 0' 44109 1727204251.94190: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 44109 1727204251.94209: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44109 1727204251.94216: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44109 1727204251.94342: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 44109 1727204251.94519: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 44109 1727204251.94623: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44109 1727204251.96708: stdout chunk (state=3): 
>>>ansible-tmp-1727204251.9258404-46372-40915808129454=/root/.ansible/tmp/ansible-tmp-1727204251.9258404-46372-40915808129454 <<< 44109 1727204251.96856: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44109 1727204251.96860: stdout chunk (state=3): >>><<< 44109 1727204251.96867: stderr chunk (state=3): >>><<< 44109 1727204251.96888: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204251.9258404-46372-40915808129454=/root/.ansible/tmp/ansible-tmp-1727204251.9258404-46372-40915808129454 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44109 1727204251.96933: variable 'ansible_module_compression' from source: unknown 44109 1727204251.97070: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-44109pzfqangk/ansiballz_cache/ansible.modules.systemd-ZIP_DEFLATED 44109 1727204251.97072: variable 'ansible_facts' 
from source: unknown 44109 1727204251.97651: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204251.9258404-46372-40915808129454/AnsiballZ_systemd.py 44109 1727204251.97984: Sending initial data 44109 1727204251.97988: Sent initial data (155 bytes) 44109 1727204251.99462: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44109 1727204251.99486: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44109 1727204251.99501: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44109 1727204251.99523: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44109 1727204251.99539: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 <<< 44109 1727204251.99645: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK <<< 44109 1727204251.99893: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44109 1727204252.00006: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44109 1727204252.01752: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server 
supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" <<< 44109 1727204252.01764: stderr chunk (state=3): >>>debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 44109 1727204252.02072: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 44109 1727204252.02171: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204251.9258404-46372-40915808129454/AnsiballZ_systemd.py" <<< 44109 1727204252.02174: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-44109pzfqangk/tmp5w6ef51r /root/.ansible/tmp/ansible-tmp-1727204251.9258404-46372-40915808129454/AnsiballZ_systemd.py <<< 44109 1727204252.02226: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-44109pzfqangk/tmp5w6ef51r" to remote "/root/.ansible/tmp/ansible-tmp-1727204251.9258404-46372-40915808129454/AnsiballZ_systemd.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204251.9258404-46372-40915808129454/AnsiballZ_systemd.py" <<< 44109 1727204252.05334: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44109 1727204252.05356: stderr chunk (state=3): >>><<< 44109 1727204252.05359: stdout chunk (state=3): >>><<< 44109 1727204252.05379: done transferring module to remote 44109 1727204252.05391: _low_level_execute_command(): starting 44109 1727204252.05396: 
_low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204251.9258404-46372-40915808129454/ /root/.ansible/tmp/ansible-tmp-1727204251.9258404-46372-40915808129454/AnsiballZ_systemd.py && sleep 0' 44109 1727204252.06509: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44109 1727204252.06885: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 44109 1727204252.06890: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44109 1727204252.06892: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44109 1727204252.06956: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44109 1727204252.09071: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44109 1727204252.09131: stderr chunk (state=3): >>><<< 44109 1727204252.09143: stdout chunk (state=3): >>><<< 44109 1727204252.09168: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration 
data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44109 1727204252.09183: _low_level_execute_command(): starting 44109 1727204252.09192: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204251.9258404-46372-40915808129454/AnsiballZ_systemd.py && sleep 0' 44109 1727204252.09837: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration <<< 44109 1727204252.09896: stderr chunk 
(state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44109 1727204252.09953: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK <<< 44109 1727204252.09971: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44109 1727204252.10274: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44109 1727204252.41229: stdout chunk (state=3): >>> {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "705", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Tue 2024-09-24 14:44:02 EDT", "ExecMainStartTimestampMonotonic": "33322039", "ExecMainExitTimestampMonotonic": "0", 
"ExecMainHandoffTimestamp": "Tue 2024-09-24 14:44:02 EDT", "ExecMainHandoffTimestampMonotonic": "33336258", "ExecMainPID": "705", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "2938", "MemoryCurrent": "10772480", "MemoryPeak": "13869056", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3288502272", "EffectiveMemoryMax": "3702874112", "EffectiveMemoryHigh": "3702874112", "CPUUsageNSec": "1793388000", "TasksCurrent": "4", "EffectiveTasksMax": "22362", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", 
"IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryZSwapWriteback": "yes", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22362", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpR<<< 44109 1727204252.41250: stdout chunk (state=3): >>>eceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13976", "LimitNPROCSoft": "13976", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13976", "LimitSIGPENDINGSoft": "13976", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", 
"IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": 
"root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "sysinit.target system.slice dbus.socket", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "NetworkManager-wait-online.service cloud-init.service network.target shutdown.target multi-user.target", "After": "networ<<< 44109 1727204252.41258: stdout chunk (state=3): >>>k-pre.target sysinit.target system.slice basic.target dbus.socket systemd-journald.socket cloud-init-local.service dbus-broker.service", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Tue 2024-09-24 14:50:22 EDT", "StateChangeTimestampMonotonic": "413618667", 
"InactiveExitTimestamp": "Tue 2024-09-24 14:44:02 EDT", "InactiveExitTimestampMonotonic": "33322542", "ActiveEnterTimestamp": "Tue 2024-09-24 14:44:03 EDT", "ActiveEnterTimestampMonotonic": "34680535", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Tue 2024-09-24 14:44:02 EDT", "ConditionTimestampMonotonic": "33321151", "AssertTimestamp": "Tue 2024-09-24 14:44:02 EDT", "AssertTimestampMonotonic": "33321155", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "53c91cc8356748b484feba73dc5ee144", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} <<< 44109 1727204252.43506: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.47 closed. 
<<< 44109 1727204252.43532: stderr chunk (state=3): >>><<< 44109 1727204252.43535: stdout chunk (state=3): >>><<< 44109 1727204252.43552: _low_level_execute_command() done: rc=0, stdout= {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "705", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Tue 2024-09-24 14:44:02 EDT", "ExecMainStartTimestampMonotonic": "33322039", "ExecMainExitTimestampMonotonic": "0", "ExecMainHandoffTimestamp": "Tue 2024-09-24 14:44:02 EDT", "ExecMainHandoffTimestampMonotonic": "33336258", "ExecMainPID": "705", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager 
/org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "2938", "MemoryCurrent": "10772480", "MemoryPeak": "13869056", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3288502272", "EffectiveMemoryMax": "3702874112", "EffectiveMemoryHigh": "3702874112", "CPUUsageNSec": "1793388000", "TasksCurrent": "4", "EffectiveTasksMax": "22362", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryZSwapWriteback": "yes", "MemoryLimit": "infinity", "DevicePolicy": 
"auto", "TasksAccounting": "yes", "TasksMax": "22362", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpReceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13976", "LimitNPROCSoft": "13976", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13976", "LimitSIGPENDINGSoft": "13976", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot 
cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", 
"Names": "NetworkManager.service", "Requires": "sysinit.target system.slice dbus.socket", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "NetworkManager-wait-online.service cloud-init.service network.target shutdown.target multi-user.target", "After": "network-pre.target sysinit.target system.slice basic.target dbus.socket systemd-journald.socket cloud-init-local.service dbus-broker.service", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Tue 2024-09-24 14:50:22 EDT", "StateChangeTimestampMonotonic": "413618667", "InactiveExitTimestamp": "Tue 2024-09-24 14:44:02 EDT", "InactiveExitTimestampMonotonic": "33322542", "ActiveEnterTimestamp": "Tue 2024-09-24 14:44:03 EDT", "ActiveEnterTimestampMonotonic": "34680535", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Tue 2024-09-24 14:44:02 EDT", "ConditionTimestampMonotonic": "33321151", "AssertTimestamp": "Tue 2024-09-24 14:44:02 EDT", "AssertTimestampMonotonic": "33321155", 
"Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "53c91cc8356748b484feba73dc5ee144", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.47 closed. 
44109 1727204252.43679: done with _execute_module (ansible.legacy.systemd, {'name': 'NetworkManager', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204251.9258404-46372-40915808129454/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 44109 1727204252.43696: _low_level_execute_command(): starting 44109 1727204252.43699: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204251.9258404-46372-40915808129454/ > /dev/null 2>&1 && sleep 0' 44109 1727204252.44144: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44109 1727204252.44148: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44109 1727204252.44150: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44109 1727204252.44152: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44109 1727204252.44203: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 44109 1727204252.44207: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44109 1727204252.44209: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44109 1727204252.44295: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44109 1727204252.46343: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44109 1727204252.46347: stdout chunk (state=3): >>><<< 44109 1727204252.46349: stderr chunk (state=3): >>><<< 44109 1727204252.46559: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: 
master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44109 1727204252.46562: handler run complete 44109 1727204252.46564: attempt loop complete, returning result 44109 1727204252.46566: _execute() done 44109 1727204252.46568: dumping result to json 44109 1727204252.46703: done dumping result, returning 44109 1727204252.46783: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [028d2410-947f-ed67-a560-00000000007d] 44109 1727204252.46786: sending task result for task 028d2410-947f-ed67-a560-00000000007d ok: [managed-node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 44109 1727204252.47537: no more pending results, returning what we have 44109 1727204252.47540: results queue empty 44109 1727204252.47541: checking for any_errors_fatal 44109 1727204252.47550: done checking for any_errors_fatal 44109 1727204252.47551: checking for max_fail_percentage 44109 1727204252.47553: done checking for max_fail_percentage 44109 1727204252.47554: checking to see if all hosts have failed and the running result is not ok 44109 1727204252.47555: done checking to see if all hosts have failed 44109 1727204252.47555: getting the remaining hosts for this loop 44109 1727204252.47557: done getting the remaining hosts for this loop 44109 1727204252.47561: getting the next task for host managed-node1 44109 1727204252.47566: done getting next task for host managed-node1 44109 1727204252.47570: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 44109 1727204252.47571: ^ state is: HOST STATE: block=2, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False 44109 1727204252.47584: getting variables 44109 1727204252.47586: in VariableManager get_vars() 44109 1727204252.47621: Calling all_inventory to load vars for managed-node1 44109 1727204252.47624: Calling groups_inventory to load vars for managed-node1 44109 1727204252.47627: Calling all_plugins_inventory to load vars for managed-node1 44109 1727204252.47636: Calling all_plugins_play to load vars for managed-node1 44109 1727204252.47639: Calling groups_plugins_inventory to load vars for managed-node1 44109 1727204252.47642: Calling groups_plugins_play to load vars for managed-node1 44109 1727204252.48790: done sending task result for task 028d2410-947f-ed67-a560-00000000007d 44109 1727204252.48794: WORKER PROCESS EXITING 44109 1727204252.49753: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44109 1727204252.51592: done with get_vars() 44109 1727204252.51619: done getting variables 44109 1727204252.51680: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Tuesday 24 September 2024 14:57:32 -0400 (0:00:00.759) 0:00:29.313 ***** 44109 1727204252.51717: entering _queue_task() for managed-node1/service 44109 1727204252.52068: worker is 1 (out of 1 available) 44109 1727204252.52236: exiting _queue_task() for managed-node1/service 44109 1727204252.52246: done queuing things up, now waiting for results queue to drain 44109 1727204252.52247: waiting for pending results... 
44109 1727204252.52401: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 44109 1727204252.52531: in run() - task 028d2410-947f-ed67-a560-00000000007e 44109 1727204252.52550: variable 'ansible_search_path' from source: unknown 44109 1727204252.52562: variable 'ansible_search_path' from source: unknown 44109 1727204252.52608: calling self._execute() 44109 1727204252.52724: variable 'ansible_host' from source: host vars for 'managed-node1' 44109 1727204252.52738: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44109 1727204252.52751: variable 'omit' from source: magic vars 44109 1727204252.53166: variable 'ansible_distribution_major_version' from source: facts 44109 1727204252.53185: Evaluated conditional (ansible_distribution_major_version != '6'): True 44109 1727204252.53321: variable 'network_provider' from source: set_fact 44109 1727204252.53337: Evaluated conditional (network_provider == "nm"): True 44109 1727204252.53434: variable '__network_wpa_supplicant_required' from source: role '' defaults 44109 1727204252.53560: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 44109 1727204252.53743: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 44109 1727204252.56078: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 44109 1727204252.56162: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 44109 1727204252.56221: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 44109 1727204252.56399: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 44109 1727204252.56402: Loading FilterModule 'urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 44109 1727204252.56455: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44109 1727204252.56494: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44109 1727204252.56533: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44109 1727204252.56578: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44109 1727204252.56623: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44109 1727204252.56848: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44109 1727204252.56851: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44109 1727204252.56854: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44109 1727204252.56856: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44109 1727204252.56858: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44109 1727204252.56910: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44109 1727204252.56940: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44109 1727204252.56980: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44109 1727204252.57027: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44109 1727204252.57045: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44109 1727204252.57207: variable 'network_connections' from source: play vars 44109 1727204252.57229: variable 'profile' from source: play vars 44109 1727204252.57317: variable 'profile' from source: play vars 44109 1727204252.57327: variable 'interface' from source: set_fact 44109 1727204252.57406: variable 'interface' from source: set_fact 44109 1727204252.57474: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 44109 1727204252.57718: Loading TestModule 'core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 44109 1727204252.57721: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 44109 1727204252.57746: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 44109 1727204252.57779: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 44109 1727204252.57830: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 44109 1727204252.57861: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 44109 1727204252.57893: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 44109 1727204252.57926: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 44109 1727204252.57988: variable '__network_wireless_connections_defined' from source: role '' defaults 44109 1727204252.58271: variable 'network_connections' from source: play vars 44109 1727204252.58485: variable 'profile' from source: play vars 44109 1727204252.58489: variable 'profile' from source: play vars 44109 1727204252.58491: variable 'interface' from source: set_fact 44109 1727204252.58493: variable 'interface' from source: set_fact 44109 1727204252.58495: Evaluated conditional (__network_wpa_supplicant_required): False 44109 1727204252.58497: when evaluation is False, skipping this task 44109 1727204252.58499: _execute() done 44109 1727204252.58509: dumping result 
to json 44109 1727204252.58511: done dumping result, returning 44109 1727204252.58516: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [028d2410-947f-ed67-a560-00000000007e] 44109 1727204252.58518: sending task result for task 028d2410-947f-ed67-a560-00000000007e skipping: [managed-node1] => { "changed": false, "false_condition": "__network_wpa_supplicant_required", "skip_reason": "Conditional result was False" } 44109 1727204252.58792: no more pending results, returning what we have 44109 1727204252.58796: results queue empty 44109 1727204252.58797: checking for any_errors_fatal 44109 1727204252.58819: done checking for any_errors_fatal 44109 1727204252.58820: checking for max_fail_percentage 44109 1727204252.58821: done checking for max_fail_percentage 44109 1727204252.58822: checking to see if all hosts have failed and the running result is not ok 44109 1727204252.58823: done checking to see if all hosts have failed 44109 1727204252.58824: getting the remaining hosts for this loop 44109 1727204252.58826: done getting the remaining hosts for this loop 44109 1727204252.58829: getting the next task for host managed-node1 44109 1727204252.58835: done getting next task for host managed-node1 44109 1727204252.58839: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service 44109 1727204252.58841: ^ state is: HOST STATE: block=2, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44109 1727204252.58856: getting variables 44109 1727204252.58858: in VariableManager get_vars() 44109 1727204252.59196: Calling all_inventory to load vars for managed-node1 44109 1727204252.59200: Calling groups_inventory to load vars for managed-node1 44109 1727204252.59202: Calling all_plugins_inventory to load vars for managed-node1 44109 1727204252.59287: Calling all_plugins_play to load vars for managed-node1 44109 1727204252.59290: Calling groups_plugins_inventory to load vars for managed-node1 44109 1727204252.59293: Calling groups_plugins_play to load vars for managed-node1 44109 1727204252.60286: done sending task result for task 028d2410-947f-ed67-a560-00000000007e 44109 1727204252.60289: WORKER PROCESS EXITING 44109 1727204252.62288: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44109 1727204252.65574: done with get_vars() 44109 1727204252.65605: done getting variables 44109 1727204252.65668: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable network service] ************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 Tuesday 24 September 2024 14:57:32 -0400 (0:00:00.141) 0:00:29.455 ***** 44109 1727204252.65902: entering _queue_task() for managed-node1/service 44109 1727204252.66459: worker is 1 (out of 1 available) 44109 1727204252.66471: exiting _queue_task() for managed-node1/service 44109 1727204252.66686: done queuing things up, now waiting for results queue to drain 44109 1727204252.66688: waiting for pending results... 
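The records above show a task being skipped because its `when:` conditional (`__network_wpa_supplicant_required`) evaluated to False, and the JSON result echoed back with `"skip_reason": "Conditional result was False"`. As a minimal illustration of the result shape seen in this log — a hypothetical helper, not Ansible's internal API — the skip payload can be reconstructed like this:

```python
# Hypothetical sketch of the skipped-task result dict printed in this log.
# This mirrors the JSON shape in the "skipping: [managed-node1] => {...}"
# output; it is NOT how Ansible internally builds results.
import json

def skip_result(false_condition: str) -> dict:
    """Build a result dict matching the skip output shown in the log."""
    return {
        "changed": False,
        "false_condition": false_condition,
        "skip_reason": "Conditional result was False",
    }

print(json.dumps(skip_result("__network_wpa_supplicant_required"),
                 indent=4, sort_keys=True))
```

The `false_condition` key records which `when:` expression short-circuited the task, which is what makes these verbose skips greppable.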
44109 1727204252.67068: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Enable network service 44109 1727204252.67484: in run() - task 028d2410-947f-ed67-a560-00000000007f 44109 1727204252.67488: variable 'ansible_search_path' from source: unknown 44109 1727204252.67491: variable 'ansible_search_path' from source: unknown 44109 1727204252.67494: calling self._execute() 44109 1727204252.67598: variable 'ansible_host' from source: host vars for 'managed-node1' 44109 1727204252.67610: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44109 1727204252.67625: variable 'omit' from source: magic vars 44109 1727204252.68044: variable 'ansible_distribution_major_version' from source: facts 44109 1727204252.68061: Evaluated conditional (ansible_distribution_major_version != '6'): True 44109 1727204252.68187: variable 'network_provider' from source: set_fact 44109 1727204252.68197: Evaluated conditional (network_provider == "initscripts"): False 44109 1727204252.68204: when evaluation is False, skipping this task 44109 1727204252.68212: _execute() done 44109 1727204252.68223: dumping result to json 44109 1727204252.68231: done dumping result, returning 44109 1727204252.68243: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Enable network service [028d2410-947f-ed67-a560-00000000007f] 44109 1727204252.68253: sending task result for task 028d2410-947f-ed67-a560-00000000007f skipping: [managed-node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 44109 1727204252.68414: no more pending results, returning what we have 44109 1727204252.68419: results queue empty 44109 1727204252.68420: checking for any_errors_fatal 44109 1727204252.68429: done checking for any_errors_fatal 44109 1727204252.68429: checking for max_fail_percentage 44109 1727204252.68431: done checking for max_fail_percentage 44109 
1727204252.68432: checking to see if all hosts have failed and the running result is not ok 44109 1727204252.68433: done checking to see if all hosts have failed 44109 1727204252.68434: getting the remaining hosts for this loop 44109 1727204252.68435: done getting the remaining hosts for this loop 44109 1727204252.68438: getting the next task for host managed-node1 44109 1727204252.68444: done getting next task for host managed-node1 44109 1727204252.68447: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 44109 1727204252.68450: ^ state is: HOST STATE: block=2, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44109 1727204252.68467: getting variables 44109 1727204252.68469: in VariableManager get_vars() 44109 1727204252.68509: Calling all_inventory to load vars for managed-node1 44109 1727204252.68513: Calling groups_inventory to load vars for managed-node1 44109 1727204252.68516: Calling all_plugins_inventory to load vars for managed-node1 44109 1727204252.68528: Calling all_plugins_play to load vars for managed-node1 44109 1727204252.68531: Calling groups_plugins_inventory to load vars for managed-node1 44109 1727204252.68534: Calling groups_plugins_play to load vars for managed-node1 44109 1727204252.69064: done sending task result for task 028d2410-947f-ed67-a560-00000000007f 44109 1727204252.69068: WORKER PROCESS EXITING 44109 1727204252.70269: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44109 1727204252.71896: done with get_vars() 44109 1727204252.71926: done getting variables 44109 1727204252.71990: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150 Tuesday 24 September 2024 14:57:32 -0400 (0:00:00.061) 0:00:29.516 ***** 44109 1727204252.72022: entering _queue_task() for managed-node1/copy 44109 1727204252.72383: worker is 1 (out of 1 available) 44109 1727204252.72508: exiting _queue_task() for managed-node1/copy 44109 1727204252.72518: done queuing things up, now waiting for results queue to drain 44109 1727204252.72519: waiting for pending results... 44109 1727204252.73297: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 44109 1727204252.73302: in run() - task 028d2410-947f-ed67-a560-000000000080 44109 1727204252.73305: variable 'ansible_search_path' from source: unknown 44109 1727204252.73308: variable 'ansible_search_path' from source: unknown 44109 1727204252.73347: calling self._execute() 44109 1727204252.73614: variable 'ansible_host' from source: host vars for 'managed-node1' 44109 1727204252.73627: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44109 1727204252.73645: variable 'omit' from source: magic vars 44109 1727204252.74148: variable 'ansible_distribution_major_version' from source: facts 44109 1727204252.74168: Evaluated conditional (ansible_distribution_major_version != '6'): True 44109 1727204252.74327: variable 'network_provider' from source: set_fact 44109 1727204252.74334: Evaluated conditional (network_provider == "initscripts"): False 44109 1727204252.74337: when evaluation is False, skipping this task 44109 1727204252.74340: _execute() done 44109 1727204252.74343: dumping result to json 
44109 1727204252.74346: done dumping result, returning 44109 1727204252.74354: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [028d2410-947f-ed67-a560-000000000080] 44109 1727204252.74357: sending task result for task 028d2410-947f-ed67-a560-000000000080 44109 1727204252.74458: done sending task result for task 028d2410-947f-ed67-a560-000000000080 44109 1727204252.74461: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 44109 1727204252.74526: no more pending results, returning what we have 44109 1727204252.74530: results queue empty 44109 1727204252.74531: checking for any_errors_fatal 44109 1727204252.74537: done checking for any_errors_fatal 44109 1727204252.74537: checking for max_fail_percentage 44109 1727204252.74539: done checking for max_fail_percentage 44109 1727204252.74540: checking to see if all hosts have failed and the running result is not ok 44109 1727204252.74541: done checking to see if all hosts have failed 44109 1727204252.74541: getting the remaining hosts for this loop 44109 1727204252.74543: done getting the remaining hosts for this loop 44109 1727204252.74546: getting the next task for host managed-node1 44109 1727204252.74552: done getting next task for host managed-node1 44109 1727204252.74555: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles 44109 1727204252.74557: ^ state is: HOST STATE: block=2, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44109 1727204252.74570: getting variables 44109 1727204252.74571: in VariableManager get_vars() 44109 1727204252.74615: Calling all_inventory to load vars for managed-node1 44109 1727204252.74618: Calling groups_inventory to load vars for managed-node1 44109 1727204252.74620: Calling all_plugins_inventory to load vars for managed-node1 44109 1727204252.74631: Calling all_plugins_play to load vars for managed-node1 44109 1727204252.74633: Calling groups_plugins_inventory to load vars for managed-node1 44109 1727204252.74636: Calling groups_plugins_play to load vars for managed-node1 44109 1727204252.76003: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44109 1727204252.78386: done with get_vars() 44109 1727204252.78416: done getting variables TASK [fedora.linux_system_roles.network : Configure networking connection profiles] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Tuesday 24 September 2024 14:57:32 -0400 (0:00:00.064) 0:00:29.581 ***** 44109 1727204252.78504: entering _queue_task() for managed-node1/fedora.linux_system_roles.network_connections 44109 1727204252.79003: worker is 1 (out of 1 available) 44109 1727204252.79013: exiting _queue_task() for managed-node1/fedora.linux_system_roles.network_connections 44109 1727204252.79023: done queuing things up, now waiting for results queue to drain 44109 1727204252.79024: waiting for pending results... 
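Each `TASK [...]` banner in this log carries two durations from the timing callback, e.g. `(0:00:00.064)` for the previous task and `0:00:29.581` for cumulative playbook time. A small parsing sketch (my own reconstruction, not part of Ansible) that extracts both from such a banner line:

```python
# Parse the two H:MM:SS.fff durations out of a timing banner line like the
# ones in this log. The regex deliberately requires a fractional part, so it
# skips the wall-clock "14:57:32" timestamp earlier in the line.
import re
from datetime import timedelta

BANNER = "Tuesday 24 September 2024 14:57:32 -0400 (0:00:00.141) 0:00:29.455"

def parse_durations(line: str) -> tuple[timedelta, timedelta]:
    spans = re.findall(r"(\d+):(\d{2}):(\d{2})\.(\d+)", line)
    out = [
        timedelta(hours=int(h), minutes=int(m), seconds=int(s),
                  microseconds=int(frac.ljust(6, "0")))
        for h, m, s, frac in spans
    ]
    task_time, total_time = out
    return task_time, total_time

task_time, total = parse_durations(BANNER)
print(task_time.total_seconds(), total.total_seconds())  # 0.141 29.455
```

This is handy when skimming long verbose runs: the per-task figure pinpoints slow tasks without re-running under a profiling callback.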
44109 1727204252.79264: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 44109 1727204252.79305: in run() - task 028d2410-947f-ed67-a560-000000000081 44109 1727204252.79326: variable 'ansible_search_path' from source: unknown 44109 1727204252.79333: variable 'ansible_search_path' from source: unknown 44109 1727204252.79382: calling self._execute() 44109 1727204252.79487: variable 'ansible_host' from source: host vars for 'managed-node1' 44109 1727204252.79498: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44109 1727204252.79514: variable 'omit' from source: magic vars 44109 1727204252.79968: variable 'ansible_distribution_major_version' from source: facts 44109 1727204252.79987: Evaluated conditional (ansible_distribution_major_version != '6'): True 44109 1727204252.79997: variable 'omit' from source: magic vars 44109 1727204252.80052: variable 'omit' from source: magic vars 44109 1727204252.80226: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 44109 1727204252.83501: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 44109 1727204252.83524: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 44109 1727204252.83563: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 44109 1727204252.83608: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 44109 1727204252.83640: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 44109 1727204252.83727: variable 'network_provider' from source: set_fact 44109 1727204252.83870: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44109 1727204252.83910: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44109 1727204252.83980: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44109 1727204252.83993: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44109 1727204252.84011: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44109 1727204252.84093: variable 'omit' from source: magic vars 44109 1727204252.84226: variable 'omit' from source: magic vars 44109 1727204252.84341: variable 'network_connections' from source: play vars 44109 1727204252.84478: variable 'profile' from source: play vars 44109 1727204252.84481: variable 'profile' from source: play vars 44109 1727204252.84484: variable 'interface' from source: set_fact 44109 1727204252.84511: variable 'interface' from source: set_fact 44109 1727204252.84655: variable 'omit' from source: magic vars 44109 1727204252.84667: variable '__lsr_ansible_managed' from source: task vars 44109 1727204252.84733: variable '__lsr_ansible_managed' from source: task vars 44109 1727204252.85023: Loaded config def from plugin (lookup/template) 44109 1727204252.85032: Loading LookupModule 'template' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/template.py 44109 1727204252.85060: File lookup term: get_ansible_managed.j2 44109 
1727204252.85067: variable 'ansible_search_path' from source: unknown 44109 1727204252.85078: evaluation_path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks 44109 1727204252.85094: search_path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/templates/get_ansible_managed.j2 /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/get_ansible_managed.j2 /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/templates/get_ansible_managed.j2 /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/get_ansible_managed.j2 /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 44109 1727204252.85116: variable 'ansible_search_path' from source: unknown 44109 1727204252.93034: variable 'ansible_managed' from source: unknown 44109 1727204252.93196: variable 'omit' from source: magic vars 44109 1727204252.93231: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 44109 1727204252.93268: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 44109 1727204252.93293: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 44109 1727204252.93313: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44109 1727204252.93328: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44109 1727204252.93363: variable 'inventory_hostname' from source: host vars for 'managed-node1' 44109 1727204252.93468: variable 'ansible_host' from source: host vars for 'managed-node1' 44109 1727204252.93471: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44109 1727204252.93480: Set connection var ansible_connection to ssh 44109 1727204252.93491: Set connection var ansible_timeout to 10 44109 1727204252.93500: Set connection var ansible_module_compression to ZIP_DEFLATED 44109 1727204252.93512: Set connection var ansible_pipelining to False 44109 1727204252.93521: Set connection var ansible_shell_executable to /bin/sh 44109 1727204252.93530: Set connection var ansible_shell_type to sh 44109 1727204252.93555: variable 'ansible_shell_executable' from source: unknown 44109 1727204252.93562: variable 'ansible_connection' from source: unknown 44109 1727204252.93568: variable 'ansible_module_compression' from source: unknown 44109 1727204252.93582: variable 'ansible_shell_type' from source: unknown 44109 1727204252.93589: variable 'ansible_shell_executable' from source: unknown 44109 1727204252.93595: variable 'ansible_host' from source: host vars for 'managed-node1' 44109 1727204252.93602: variable 'ansible_pipelining' from source: unknown 44109 1727204252.93608: variable 'ansible_timeout' from source: unknown 44109 1727204252.93616: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44109 1727204252.93752: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 44109 1727204252.93780: variable 'omit' from source: magic vars 44109 1727204252.93795: starting attempt loop 44109 1727204252.93802: running the handler 44109 1727204252.93899: _low_level_execute_command(): starting 44109 1727204252.93902: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 44109 1727204252.94531: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44109 1727204252.94559: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44109 1727204252.94590: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44109 1727204252.94672: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44109 1727204252.94693: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 44109 1727204252.94712: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44109 1727204252.94728: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44109 1727204252.94851: stderr chunk (state=3): >>>debug1: mux_client_request_session: master 
session id: 2 <<< 44109 1727204252.96658: stdout chunk (state=3): >>>/root <<< 44109 1727204252.96799: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44109 1727204252.96802: stdout chunk (state=3): >>><<< 44109 1727204252.96814: stderr chunk (state=3): >>><<< 44109 1727204252.96984: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44109 1727204252.96989: _low_level_execute_command(): starting 44109 1727204252.96992: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204252.9683423-46418-91832117277608 `" && echo ansible-tmp-1727204252.9683423-46418-91832117277608="` echo /root/.ansible/tmp/ansible-tmp-1727204252.9683423-46418-91832117277608 `" ) && sleep 0' 44109 
1727204252.97494: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44109 1727204252.97506: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44109 1727204252.97524: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44109 1727204252.97536: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44109 1727204252.97548: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 <<< 44109 1727204252.97555: stderr chunk (state=3): >>>debug2: match not found <<< 44109 1727204252.97564: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44109 1727204252.97579: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 44109 1727204252.97595: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.14.47 is address <<< 44109 1727204252.97632: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 44109 1727204252.97635: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44109 1727204252.97638: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44109 1727204252.97640: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44109 1727204252.97642: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 <<< 44109 1727204252.97645: stderr chunk (state=3): >>>debug2: match found <<< 44109 1727204252.97652: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44109 1727204252.97761: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 44109 1727204252.97771: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK 
debug2: mux_client_hello_exchange: master version 4 <<< 44109 1727204252.97853: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44109 1727204252.99946: stdout chunk (state=3): >>>ansible-tmp-1727204252.9683423-46418-91832117277608=/root/.ansible/tmp/ansible-tmp-1727204252.9683423-46418-91832117277608 <<< 44109 1727204253.00120: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44109 1727204253.00123: stdout chunk (state=3): >>><<< 44109 1727204253.00125: stderr chunk (state=3): >>><<< 44109 1727204253.00143: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204252.9683423-46418-91832117277608=/root/.ansible/tmp/ansible-tmp-1727204252.9683423-46418-91832117277608 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44109 1727204253.00198: variable 'ansible_module_compression' from source: 
unknown 44109 1727204253.00281: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-44109pzfqangk/ansiballz_cache/ansible_collections.fedora.linux_system_roles.plugins.modules.network_connections-ZIP_DEFLATED 44109 1727204253.00303: variable 'ansible_facts' from source: unknown 44109 1727204253.00419: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204252.9683423-46418-91832117277608/AnsiballZ_network_connections.py 44109 1727204253.00669: Sending initial data 44109 1727204253.00672: Sent initial data (167 bytes) 44109 1727204253.01299: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44109 1727204253.01325: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 44109 1727204253.01343: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44109 1727204253.01363: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44109 1727204253.01478: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44109 1727204253.03235: stderr chunk (state=3): 
>>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 44109 1727204253.03323: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 44109 1727204253.03400: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-44109pzfqangk/tmpg5mzyqa1 /root/.ansible/tmp/ansible-tmp-1727204252.9683423-46418-91832117277608/AnsiballZ_network_connections.py <<< 44109 1727204253.03403: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204252.9683423-46418-91832117277608/AnsiballZ_network_connections.py" <<< 44109 1727204253.03510: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-44109pzfqangk/tmpg5mzyqa1" to remote "/root/.ansible/tmp/ansible-tmp-1727204252.9683423-46418-91832117277608/AnsiballZ_network_connections.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204252.9683423-46418-91832117277608/AnsiballZ_network_connections.py" <<< 44109 1727204253.04661: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44109 1727204253.04781: stderr chunk (state=3): >>><<< 44109 1727204253.04785: stdout chunk (state=3): >>><<< 44109 1727204253.04787: done transferring module to remote 
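The `_low_level_execute_command()` records above create a remote working directory whose name follows the pattern `ansible-tmp-<epoch timestamp>-<pid>-<random>` (here `ansible-tmp-1727204252.9683423-46418-91832117277608`), guarded by `umask 77` so only the remote user can read it. A sketch of producing such a name and the corresponding shell command — my own reconstruction of the pattern visible in the log, not Ansible's actual implementation:

```python
# Reconstruct the remote tmpdir naming scheme and mkdir command seen in the
# log's _low_level_execute_command() lines. Illustrative only; the real
# command string is assembled inside Ansible's shell plugin.
import os
import random
import time

def remote_tmpdir_command(base: str = "/root/.ansible/tmp") -> str:
    name = "ansible-tmp-%s-%s-%s" % (time.time(), os.getpid(),
                                     random.randint(0, 2 ** 48))
    path = "%s/%s" % (base, name)
    # umask 77 -> directory mode 0700: private to the connecting user
    return ("/bin/sh -c '( umask 77 && mkdir -p \"%s\" && mkdir \"%s\" "
            "&& echo %s=\"%s\" ) && sleep 0'" % (base, path, name, path))

print(remote_tmpdir_command())
```

The trailing `echo name="path"` is what lets the controller learn the generated path from stdout (as seen in the `rc=0, stdout=ansible-tmp-...` record), and `&& sleep 0` forces a clean flush of the multiplexed SSH channel.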
44109 1727204253.04790: _low_level_execute_command(): starting 44109 1727204253.04792: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204252.9683423-46418-91832117277608/ /root/.ansible/tmp/ansible-tmp-1727204252.9683423-46418-91832117277608/AnsiballZ_network_connections.py && sleep 0' 44109 1727204253.05361: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44109 1727204253.05383: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44109 1727204253.05399: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44109 1727204253.05419: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44109 1727204253.05437: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 <<< 44109 1727204253.05457: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44109 1727204253.05488: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44109 1727204253.05580: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK <<< 44109 1727204253.05645: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44109 1727204253.05721: stderr chunk (state=3): >>>debug1: mux_client_request_session: master 
session id: 2 <<< 44109 1727204253.07721: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44109 1727204253.07725: stdout chunk (state=3): >>><<< 44109 1727204253.07727: stderr chunk (state=3): >>><<< 44109 1727204253.07821: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44109 1727204253.07824: _low_level_execute_command(): starting 44109 1727204253.07827: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204252.9683423-46418-91832117277608/AnsiballZ_network_connections.py && sleep 0' 44109 1727204253.08357: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44109 1727204253.08371: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44109 1727204253.08388: stderr chunk (state=3): 
>>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44109 1727204253.08404: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44109 1727204253.08420: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 <<< 44109 1727204253.08495: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44109 1727204253.08533: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 44109 1727204253.08550: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44109 1727204253.08572: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44109 1727204253.08690: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44109 1727204253.41044: stdout chunk (state=3): >>> {"changed": true, "warnings": [], "stderr": "\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "ethtest0", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "ethtest0", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, 
"__debug_flags": ""}}} <<< 44109 1727204253.43500: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.47 closed. <<< 44109 1727204253.43549: stderr chunk (state=3): >>><<< 44109 1727204253.43552: stdout chunk (state=3): >>><<< 44109 1727204253.43580: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "warnings": [], "stderr": "\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "ethtest0", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "ethtest0", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: 
mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.47 closed. 44109 1727204253.43625: done with _execute_module (fedora.linux_system_roles.network_connections, {'provider': 'nm', 'connections': [{'name': 'ethtest0', 'state': 'down'}], '__header': '#\n# Ansible managed\n#\n# system_role:network\n', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'fedora.linux_system_roles.network_connections', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204252.9683423-46418-91832117277608/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 44109 1727204253.43635: _low_level_execute_command(): starting 44109 1727204253.43640: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204252.9683423-46418-91832117277608/ > /dev/null 2>&1 && sleep 0' 44109 1727204253.44540: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44109 1727204253.44543: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found <<< 44109 1727204253.44630: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44109 1727204253.44634: stderr chunk (state=3): >>>debug1: configuration requests final Match pass 
debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 44109 1727204253.44637: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found <<< 44109 1727204253.44639: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44109 1727204253.44641: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 44109 1727204253.44643: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44109 1727204253.44690: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44109 1727204253.44774: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44109 1727204253.46854: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44109 1727204253.46868: stderr chunk (state=3): >>><<< 44109 1727204253.46878: stdout chunk (state=3): >>><<< 44109 1727204253.46902: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44109 1727204253.47081: handler run complete 44109 1727204253.47085: attempt loop complete, returning result 44109 1727204253.47087: _execute() done 44109 1727204253.47089: dumping result to json 44109 1727204253.47092: done dumping result, returning 44109 1727204253.47094: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [028d2410-947f-ed67-a560-000000000081] 44109 1727204253.47096: sending task result for task 028d2410-947f-ed67-a560-000000000081 changed: [managed-node1] => { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "name": "ethtest0", "state": "down" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true } STDERR: 44109 1727204253.47244: done sending task result for task 028d2410-947f-ed67-a560-000000000081 44109 1727204253.47317: WORKER PROCESS EXITING 44109 1727204253.47326: no more pending results, returning what we have 44109 1727204253.47330: results queue empty 44109 1727204253.47331: checking for any_errors_fatal 44109 1727204253.47336: done checking for any_errors_fatal 44109 1727204253.47337: checking for max_fail_percentage 44109 1727204253.47338: done checking for max_fail_percentage 44109 1727204253.47339: checking to see if all hosts have failed and the running result is not ok 44109 1727204253.47340: done checking to see if all hosts have failed 44109 
1727204253.47341: getting the remaining hosts for this loop 44109 1727204253.47343: done getting the remaining hosts for this loop 44109 1727204253.47346: getting the next task for host managed-node1 44109 1727204253.47351: done getting next task for host managed-node1 44109 1727204253.47355: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 44109 1727204253.47356: ^ state is: HOST STATE: block=2, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44109 1727204253.47365: getting variables 44109 1727204253.47366: in VariableManager get_vars() 44109 1727204253.47496: Calling all_inventory to load vars for managed-node1 44109 1727204253.47499: Calling groups_inventory to load vars for managed-node1 44109 1727204253.47501: Calling all_plugins_inventory to load vars for managed-node1 44109 1727204253.47510: Calling all_plugins_play to load vars for managed-node1 44109 1727204253.47512: Calling groups_plugins_inventory to load vars for managed-node1 44109 1727204253.47515: Calling groups_plugins_play to load vars for managed-node1 44109 1727204253.49051: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44109 1727204253.50134: done with get_vars() 44109 1727204253.50152: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Tuesday 24 September 2024 14:57:33 -0400 (0:00:00.717) 0:00:30.298 ***** 44109 1727204253.50218: entering _queue_task() for managed-node1/fedora.linux_system_roles.network_state 44109 1727204253.50464: worker is 1 (out of 1 available) 44109 1727204253.50478: exiting _queue_task() for 
managed-node1/fedora.linux_system_roles.network_state 44109 1727204253.50489: done queuing things up, now waiting for results queue to drain 44109 1727204253.50490: waiting for pending results... 44109 1727204253.50680: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Configure networking state 44109 1727204253.50765: in run() - task 028d2410-947f-ed67-a560-000000000082 44109 1727204253.50777: variable 'ansible_search_path' from source: unknown 44109 1727204253.50781: variable 'ansible_search_path' from source: unknown 44109 1727204253.50810: calling self._execute() 44109 1727204253.50891: variable 'ansible_host' from source: host vars for 'managed-node1' 44109 1727204253.50895: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44109 1727204253.50903: variable 'omit' from source: magic vars 44109 1727204253.51287: variable 'ansible_distribution_major_version' from source: facts 44109 1727204253.51306: Evaluated conditional (ansible_distribution_major_version != '6'): True 44109 1727204253.51445: variable 'network_state' from source: role '' defaults 44109 1727204253.51460: Evaluated conditional (network_state != {}): False 44109 1727204253.51467: when evaluation is False, skipping this task 44109 1727204253.51482: _execute() done 44109 1727204253.51489: dumping result to json 44109 1727204253.51501: done dumping result, returning 44109 1727204253.51518: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Configure networking state [028d2410-947f-ed67-a560-000000000082] 44109 1727204253.51527: sending task result for task 028d2410-947f-ed67-a560-000000000082 44109 1727204253.51662: done sending task result for task 028d2410-947f-ed67-a560-000000000082 44109 1727204253.51665: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 44109 1727204253.51811: no more 
pending results, returning what we have 44109 1727204253.51817: results queue empty 44109 1727204253.51818: checking for any_errors_fatal 44109 1727204253.51837: done checking for any_errors_fatal 44109 1727204253.51838: checking for max_fail_percentage 44109 1727204253.51840: done checking for max_fail_percentage 44109 1727204253.51842: checking to see if all hosts have failed and the running result is not ok 44109 1727204253.51842: done checking to see if all hosts have failed 44109 1727204253.51843: getting the remaining hosts for this loop 44109 1727204253.51845: done getting the remaining hosts for this loop 44109 1727204253.51849: getting the next task for host managed-node1 44109 1727204253.51855: done getting next task for host managed-node1 44109 1727204253.51859: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 44109 1727204253.51861: ^ state is: HOST STATE: block=2, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44109 1727204253.51879: getting variables 44109 1727204253.51881: in VariableManager get_vars() 44109 1727204253.51925: Calling all_inventory to load vars for managed-node1 44109 1727204253.51929: Calling groups_inventory to load vars for managed-node1 44109 1727204253.51931: Calling all_plugins_inventory to load vars for managed-node1 44109 1727204253.52062: Calling all_plugins_play to load vars for managed-node1 44109 1727204253.52065: Calling groups_plugins_inventory to load vars for managed-node1 44109 1727204253.52069: Calling groups_plugins_play to load vars for managed-node1 44109 1727204253.53767: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44109 1727204253.55851: done with get_vars() 44109 1727204253.55988: done getting variables 44109 1727204253.56057: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Tuesday 24 September 2024 14:57:33 -0400 (0:00:00.058) 0:00:30.357 ***** 44109 1727204253.56094: entering _queue_task() for managed-node1/debug 44109 1727204253.56718: worker is 1 (out of 1 available) 44109 1727204253.56730: exiting _queue_task() for managed-node1/debug 44109 1727204253.56744: done queuing things up, now waiting for results queue to drain 44109 1727204253.56745: waiting for pending results... 
44109 1727204253.57246: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 44109 1727204253.57252: in run() - task 028d2410-947f-ed67-a560-000000000083 44109 1727204253.57255: variable 'ansible_search_path' from source: unknown 44109 1727204253.57258: variable 'ansible_search_path' from source: unknown 44109 1727204253.57284: calling self._execute() 44109 1727204253.57374: variable 'ansible_host' from source: host vars for 'managed-node1' 44109 1727204253.57380: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44109 1727204253.57391: variable 'omit' from source: magic vars 44109 1727204253.57672: variable 'ansible_distribution_major_version' from source: facts 44109 1727204253.57683: Evaluated conditional (ansible_distribution_major_version != '6'): True 44109 1727204253.57688: variable 'omit' from source: magic vars 44109 1727204253.57719: variable 'omit' from source: magic vars 44109 1727204253.57746: variable 'omit' from source: magic vars 44109 1727204253.57780: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 44109 1727204253.57806: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 44109 1727204253.57825: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 44109 1727204253.57840: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44109 1727204253.57850: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44109 1727204253.57873: variable 'inventory_hostname' from source: host vars for 'managed-node1' 44109 1727204253.57878: variable 'ansible_host' from source: host vars for 'managed-node1' 44109 1727204253.57881: variable 'ansible_ssh_extra_args' from 
source: host vars for 'managed-node1' 44109 1727204253.57955: Set connection var ansible_connection to ssh 44109 1727204253.57958: Set connection var ansible_timeout to 10 44109 1727204253.57964: Set connection var ansible_module_compression to ZIP_DEFLATED 44109 1727204253.57971: Set connection var ansible_pipelining to False 44109 1727204253.57978: Set connection var ansible_shell_executable to /bin/sh 44109 1727204253.57984: Set connection var ansible_shell_type to sh 44109 1727204253.58000: variable 'ansible_shell_executable' from source: unknown 44109 1727204253.58003: variable 'ansible_connection' from source: unknown 44109 1727204253.58006: variable 'ansible_module_compression' from source: unknown 44109 1727204253.58008: variable 'ansible_shell_type' from source: unknown 44109 1727204253.58011: variable 'ansible_shell_executable' from source: unknown 44109 1727204253.58016: variable 'ansible_host' from source: host vars for 'managed-node1' 44109 1727204253.58018: variable 'ansible_pipelining' from source: unknown 44109 1727204253.58020: variable 'ansible_timeout' from source: unknown 44109 1727204253.58022: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44109 1727204253.58127: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 44109 1727204253.58136: variable 'omit' from source: magic vars 44109 1727204253.58139: starting attempt loop 44109 1727204253.58142: running the handler 44109 1727204253.58321: variable '__network_connections_result' from source: set_fact 44109 1727204253.58331: handler run complete 44109 1727204253.58334: attempt loop complete, returning result 44109 1727204253.58337: _execute() done 44109 1727204253.58339: dumping result to json 44109 1727204253.58341: 
done dumping result, returning 44109 1727204253.58344: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [028d2410-947f-ed67-a560-000000000083] 44109 1727204253.58346: sending task result for task 028d2410-947f-ed67-a560-000000000083 44109 1727204253.58447: done sending task result for task 028d2410-947f-ed67-a560-000000000083 44109 1727204253.58450: WORKER PROCESS EXITING ok: [managed-node1] => { "__network_connections_result.stderr_lines": [ "" ] } 44109 1727204253.58508: no more pending results, returning what we have 44109 1727204253.58514: results queue empty 44109 1727204253.58515: checking for any_errors_fatal 44109 1727204253.58520: done checking for any_errors_fatal 44109 1727204253.58521: checking for max_fail_percentage 44109 1727204253.58523: done checking for max_fail_percentage 44109 1727204253.58524: checking to see if all hosts have failed and the running result is not ok 44109 1727204253.58525: done checking to see if all hosts have failed 44109 1727204253.58526: getting the remaining hosts for this loop 44109 1727204253.58527: done getting the remaining hosts for this loop 44109 1727204253.58531: getting the next task for host managed-node1 44109 1727204253.58536: done getting next task for host managed-node1 44109 1727204253.58540: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 44109 1727204253.58541: ^ state is: HOST STATE: block=2, task=23, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44109 1727204253.58551: getting variables 44109 1727204253.58552: in VariableManager get_vars() 44109 1727204253.58614: Calling all_inventory to load vars for managed-node1 44109 1727204253.58617: Calling groups_inventory to load vars for managed-node1 44109 1727204253.58619: Calling all_plugins_inventory to load vars for managed-node1 44109 1727204253.58627: Calling all_plugins_play to load vars for managed-node1 44109 1727204253.58630: Calling groups_plugins_inventory to load vars for managed-node1 44109 1727204253.58632: Calling groups_plugins_play to load vars for managed-node1 44109 1727204253.59943: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44109 1727204253.61685: done with get_vars() 44109 1727204253.61711: done getting variables 44109 1727204253.61773: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_connections] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181 Tuesday 24 September 2024 14:57:33 -0400 (0:00:00.057) 0:00:30.414 ***** 44109 1727204253.61809: entering _queue_task() for managed-node1/debug 44109 1727204253.62128: worker is 1 (out of 1 available) 44109 1727204253.62140: exiting _queue_task() for managed-node1/debug 44109 1727204253.62153: done queuing things up, now waiting for results queue to drain 44109 1727204253.62154: waiting for pending results... 
44109 1727204253.62484: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 44109 1727204253.62490: in run() - task 028d2410-947f-ed67-a560-000000000084 44109 1727204253.62564: variable 'ansible_search_path' from source: unknown 44109 1727204253.62568: variable 'ansible_search_path' from source: unknown 44109 1727204253.62571: calling self._execute() 44109 1727204253.62644: variable 'ansible_host' from source: host vars for 'managed-node1' 44109 1727204253.62650: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44109 1727204253.62672: variable 'omit' from source: magic vars 44109 1727204253.63059: variable 'ansible_distribution_major_version' from source: facts 44109 1727204253.63070: Evaluated conditional (ansible_distribution_major_version != '6'): True 44109 1727204253.63085: variable 'omit' from source: magic vars 44109 1727204253.63249: variable 'omit' from source: magic vars 44109 1727204253.63252: variable 'omit' from source: magic vars 44109 1727204253.63255: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 44109 1727204253.63257: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 44109 1727204253.63267: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 44109 1727204253.63288: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44109 1727204253.63300: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44109 1727204253.63340: variable 'inventory_hostname' from source: host vars for 'managed-node1' 44109 1727204253.63343: variable 'ansible_host' from source: host vars for 'managed-node1' 44109 1727204253.63355: variable 'ansible_ssh_extra_args' from 
source: host vars for 'managed-node1' 44109 1727204253.63441: Set connection var ansible_connection to ssh 44109 1727204253.63445: Set connection var ansible_timeout to 10 44109 1727204253.63450: Set connection var ansible_module_compression to ZIP_DEFLATED 44109 1727204253.63463: Set connection var ansible_pipelining to False 44109 1727204253.63467: Set connection var ansible_shell_executable to /bin/sh 44109 1727204253.63472: Set connection var ansible_shell_type to sh 44109 1727204253.63494: variable 'ansible_shell_executable' from source: unknown 44109 1727204253.63497: variable 'ansible_connection' from source: unknown 44109 1727204253.63500: variable 'ansible_module_compression' from source: unknown 44109 1727204253.63502: variable 'ansible_shell_type' from source: unknown 44109 1727204253.63504: variable 'ansible_shell_executable' from source: unknown 44109 1727204253.63507: variable 'ansible_host' from source: host vars for 'managed-node1' 44109 1727204253.63509: variable 'ansible_pipelining' from source: unknown 44109 1727204253.63514: variable 'ansible_timeout' from source: unknown 44109 1727204253.63516: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44109 1727204253.63626: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 44109 1727204253.63634: variable 'omit' from source: magic vars 44109 1727204253.63638: starting attempt loop 44109 1727204253.63642: running the handler 44109 1727204253.63681: variable '__network_connections_result' from source: set_fact 44109 1727204253.63738: variable '__network_connections_result' from source: set_fact 44109 1727204253.63809: handler run complete 44109 1727204253.63828: attempt loop complete, returning result 44109 1727204253.63831: 
_execute() done 44109 1727204253.63834: dumping result to json 44109 1727204253.63836: done dumping result, returning 44109 1727204253.63844: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [028d2410-947f-ed67-a560-000000000084] 44109 1727204253.63847: sending task result for task 028d2410-947f-ed67-a560-000000000084 44109 1727204253.63936: done sending task result for task 028d2410-947f-ed67-a560-000000000084 44109 1727204253.63942: WORKER PROCESS EXITING ok: [managed-node1] => { "__network_connections_result": { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "name": "ethtest0", "state": "down" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true, "failed": false, "stderr": "\n", "stderr_lines": [ "" ] } } 44109 1727204253.64017: no more pending results, returning what we have 44109 1727204253.64021: results queue empty 44109 1727204253.64021: checking for any_errors_fatal 44109 1727204253.64028: done checking for any_errors_fatal 44109 1727204253.64028: checking for max_fail_percentage 44109 1727204253.64030: done checking for max_fail_percentage 44109 1727204253.64031: checking to see if all hosts have failed and the running result is not ok 44109 1727204253.64031: done checking to see if all hosts have failed 44109 1727204253.64032: getting the remaining hosts for this loop 44109 1727204253.64034: done getting the remaining hosts for this loop 44109 1727204253.64037: getting the next task for host managed-node1 44109 1727204253.64042: done getting next task for host managed-node1 44109 1727204253.64045: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 44109 1727204253.64047: ^ state is: HOST STATE: block=2, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, 
pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44109 1727204253.64056: getting variables 44109 1727204253.64057: in VariableManager get_vars() 44109 1727204253.64091: Calling all_inventory to load vars for managed-node1 44109 1727204253.64094: Calling groups_inventory to load vars for managed-node1 44109 1727204253.64096: Calling all_plugins_inventory to load vars for managed-node1 44109 1727204253.64104: Calling all_plugins_play to load vars for managed-node1 44109 1727204253.64107: Calling groups_plugins_inventory to load vars for managed-node1 44109 1727204253.64109: Calling groups_plugins_play to load vars for managed-node1 44109 1727204253.65722: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44109 1727204253.67681: done with get_vars() 44109 1727204253.67705: done getting variables 44109 1727204253.67767: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186 Tuesday 24 September 2024 14:57:33 -0400 (0:00:00.059) 0:00:30.474 ***** 44109 1727204253.67803: entering _queue_task() for managed-node1/debug 44109 1727204253.68129: worker is 1 (out of 1 available) 44109 1727204253.68140: exiting _queue_task() for managed-node1/debug 44109 1727204253.68156: done queuing things up, now waiting for results queue to drain 44109 1727204253.68157: waiting for pending results... 
44109 1727204253.68597: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 44109 1727204253.68602: in run() - task 028d2410-947f-ed67-a560-000000000085 44109 1727204253.68605: variable 'ansible_search_path' from source: unknown 44109 1727204253.68608: variable 'ansible_search_path' from source: unknown 44109 1727204253.68640: calling self._execute() 44109 1727204253.68757: variable 'ansible_host' from source: host vars for 'managed-node1' 44109 1727204253.68768: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44109 1727204253.68784: variable 'omit' from source: magic vars 44109 1727204253.69170: variable 'ansible_distribution_major_version' from source: facts 44109 1727204253.69237: Evaluated conditional (ansible_distribution_major_version != '6'): True 44109 1727204253.69317: variable 'network_state' from source: role '' defaults 44109 1727204253.69334: Evaluated conditional (network_state != {}): False 44109 1727204253.69350: when evaluation is False, skipping this task 44109 1727204253.69358: _execute() done 44109 1727204253.69366: dumping result to json 44109 1727204253.69373: done dumping result, returning 44109 1727204253.69387: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [028d2410-947f-ed67-a560-000000000085] 44109 1727204253.69396: sending task result for task 028d2410-947f-ed67-a560-000000000085 44109 1727204253.69671: done sending task result for task 028d2410-947f-ed67-a560-000000000085 44109 1727204253.69675: WORKER PROCESS EXITING skipping: [managed-node1] => { "false_condition": "network_state != {}" } 44109 1727204253.69722: no more pending results, returning what we have 44109 1727204253.69726: results queue empty 44109 1727204253.69727: checking for any_errors_fatal 44109 1727204253.69735: done checking for any_errors_fatal 44109 1727204253.69735: checking for 
max_fail_percentage 44109 1727204253.69737: done checking for max_fail_percentage 44109 1727204253.69738: checking to see if all hosts have failed and the running result is not ok 44109 1727204253.69739: done checking to see if all hosts have failed 44109 1727204253.69740: getting the remaining hosts for this loop 44109 1727204253.69741: done getting the remaining hosts for this loop 44109 1727204253.69744: getting the next task for host managed-node1 44109 1727204253.69749: done getting next task for host managed-node1 44109 1727204253.69753: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity 44109 1727204253.69755: ^ state is: HOST STATE: block=2, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44109 1727204253.69769: getting variables 44109 1727204253.69771: in VariableManager get_vars() 44109 1727204253.69814: Calling all_inventory to load vars for managed-node1 44109 1727204253.69817: Calling groups_inventory to load vars for managed-node1 44109 1727204253.69819: Calling all_plugins_inventory to load vars for managed-node1 44109 1727204253.69829: Calling all_plugins_play to load vars for managed-node1 44109 1727204253.69831: Calling groups_plugins_inventory to load vars for managed-node1 44109 1727204253.69834: Calling groups_plugins_play to load vars for managed-node1 44109 1727204253.71342: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44109 1727204253.73016: done with get_vars() 44109 1727204253.73049: done getting variables TASK [fedora.linux_system_roles.network : Re-test connectivity] **************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192 Tuesday 24 September 2024 14:57:33 -0400 
(0:00:00.053) 0:00:30.527 ***** 44109 1727204253.73149: entering _queue_task() for managed-node1/ping 44109 1727204253.73518: worker is 1 (out of 1 available) 44109 1727204253.73531: exiting _queue_task() for managed-node1/ping 44109 1727204253.73543: done queuing things up, now waiting for results queue to drain 44109 1727204253.73544: waiting for pending results... 44109 1727204253.73899: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Re-test connectivity 44109 1727204253.73943: in run() - task 028d2410-947f-ed67-a560-000000000086 44109 1727204253.73962: variable 'ansible_search_path' from source: unknown 44109 1727204253.73969: variable 'ansible_search_path' from source: unknown 44109 1727204253.74018: calling self._execute() 44109 1727204253.74130: variable 'ansible_host' from source: host vars for 'managed-node1' 44109 1727204253.74140: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44109 1727204253.74153: variable 'omit' from source: magic vars 44109 1727204253.74548: variable 'ansible_distribution_major_version' from source: facts 44109 1727204253.74653: Evaluated conditional (ansible_distribution_major_version != '6'): True 44109 1727204253.74656: variable 'omit' from source: magic vars 44109 1727204253.74658: variable 'omit' from source: magic vars 44109 1727204253.74660: variable 'omit' from source: magic vars 44109 1727204253.74707: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 44109 1727204253.74747: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 44109 1727204253.74783: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 44109 1727204253.74805: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44109 1727204253.74822: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44109 1727204253.74854: variable 'inventory_hostname' from source: host vars for 'managed-node1' 44109 1727204253.74866: variable 'ansible_host' from source: host vars for 'managed-node1' 44109 1727204253.74881: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44109 1727204253.74990: Set connection var ansible_connection to ssh 44109 1727204253.75083: Set connection var ansible_timeout to 10 44109 1727204253.75086: Set connection var ansible_module_compression to ZIP_DEFLATED 44109 1727204253.75088: Set connection var ansible_pipelining to False 44109 1727204253.75090: Set connection var ansible_shell_executable to /bin/sh 44109 1727204253.75094: Set connection var ansible_shell_type to sh 44109 1727204253.75096: variable 'ansible_shell_executable' from source: unknown 44109 1727204253.75098: variable 'ansible_connection' from source: unknown 44109 1727204253.75100: variable 'ansible_module_compression' from source: unknown 44109 1727204253.75102: variable 'ansible_shell_type' from source: unknown 44109 1727204253.75104: variable 'ansible_shell_executable' from source: unknown 44109 1727204253.75107: variable 'ansible_host' from source: host vars for 'managed-node1' 44109 1727204253.75108: variable 'ansible_pipelining' from source: unknown 44109 1727204253.75111: variable 'ansible_timeout' from source: unknown 44109 1727204253.75112: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44109 1727204253.75334: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 44109 1727204253.75338: variable 'omit' from source: magic vars 44109 1727204253.75340: starting attempt loop 44109 1727204253.75342: running 
the handler 44109 1727204253.75343: _low_level_execute_command(): starting 44109 1727204253.75345: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 44109 1727204253.76191: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44109 1727204253.76245: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 44109 1727204253.76262: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44109 1727204253.76300: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44109 1727204253.76425: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44109 1727204253.78220: stdout chunk (state=3): >>>/root <<< 44109 1727204253.78378: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44109 1727204253.78382: stdout chunk (state=3): >>><<< 44109 1727204253.78384: stderr chunk (state=3): >>><<< 44109 1727204253.78501: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44109 1727204253.78504: _low_level_execute_command(): starting 44109 1727204253.78507: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204253.784092-46461-260081761136823 `" && echo ansible-tmp-1727204253.784092-46461-260081761136823="` echo /root/.ansible/tmp/ansible-tmp-1727204253.784092-46461-260081761136823 `" ) && sleep 0' 44109 1727204253.79057: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44109 1727204253.79069: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44109 1727204253.79086: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44109 1727204253.79104: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44109 1727204253.79193: 
stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44109 1727204253.79234: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 44109 1727204253.79252: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44109 1727204253.79273: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44109 1727204253.79386: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44109 1727204253.81467: stdout chunk (state=3): >>>ansible-tmp-1727204253.784092-46461-260081761136823=/root/.ansible/tmp/ansible-tmp-1727204253.784092-46461-260081761136823 <<< 44109 1727204253.81981: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44109 1727204253.81986: stdout chunk (state=3): >>><<< 44109 1727204253.81989: stderr chunk (state=3): >>><<< 44109 1727204253.81992: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204253.784092-46461-260081761136823=/root/.ansible/tmp/ansible-tmp-1727204253.784092-46461-260081761136823 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final 
all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44109 1727204253.81995: variable 'ansible_module_compression' from source: unknown 44109 1727204253.81997: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-44109pzfqangk/ansiballz_cache/ansible.modules.ping-ZIP_DEFLATED 44109 1727204253.82000: variable 'ansible_facts' from source: unknown 44109 1727204253.82150: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204253.784092-46461-260081761136823/AnsiballZ_ping.py 44109 1727204253.82392: Sending initial data 44109 1727204253.82401: Sent initial data (152 bytes) 44109 1727204253.82925: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44109 1727204253.82992: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration 
requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44109 1727204253.83044: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 44109 1727204253.83069: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44109 1727204253.83140: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44109 1727204253.83304: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44109 1727204253.85087: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 44109 1727204253.85117: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 44109 1727204253.85308: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-44109pzfqangk/tmp98wbgceb /root/.ansible/tmp/ansible-tmp-1727204253.784092-46461-260081761136823/AnsiballZ_ping.py <<< 44109 1727204253.85311: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204253.784092-46461-260081761136823/AnsiballZ_ping.py" <<< 44109 1727204253.85450: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-44109pzfqangk/tmp98wbgceb" to remote "/root/.ansible/tmp/ansible-tmp-1727204253.784092-46461-260081761136823/AnsiballZ_ping.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204253.784092-46461-260081761136823/AnsiballZ_ping.py" <<< 44109 1727204253.86537: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44109 1727204253.86612: stderr chunk (state=3): >>><<< 44109 1727204253.86627: stdout chunk (state=3): >>><<< 44109 1727204253.86690: done transferring module to remote 44109 1727204253.86709: _low_level_execute_command(): starting 44109 1727204253.86718: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204253.784092-46461-260081761136823/ /root/.ansible/tmp/ansible-tmp-1727204253.784092-46461-260081761136823/AnsiballZ_ping.py && sleep 0' 44109 1727204253.87366: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44109 1727204253.87382: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44109 1727204253.87403: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44109 1727204253.87493: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44109 1727204253.87523: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 44109 1727204253.87540: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44109 1727204253.87561: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44109 1727204253.87817: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44109 1727204253.89916: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44109 1727204253.89924: stdout chunk (state=3): >>><<< 44109 1727204253.89927: stderr chunk (state=3): >>><<< 44109 1727204253.89929: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44109 1727204253.89931: _low_level_execute_command(): starting 44109 1727204253.89934: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204253.784092-46461-260081761136823/AnsiballZ_ping.py && sleep 0' 44109 1727204253.90828: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44109 1727204253.90925: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK <<< 44109 1727204253.91015: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44109 1727204253.91221: stderr 
chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44109 1727204254.07679: stdout chunk (state=3): >>> {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} <<< 44109 1727204254.09298: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.47 closed. <<< 44109 1727204254.09302: stdout chunk (state=3): >>><<< 44109 1727204254.09305: stderr chunk (state=3): >>><<< 44109 1727204254.09307: _low_level_execute_command() done: rc=0, stdout= {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.47 closed. 
44109 1727204254.09457: done with _execute_module (ping, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ping', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204253.784092-46461-260081761136823/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 44109 1727204254.09461: _low_level_execute_command(): starting 44109 1727204254.09464: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204253.784092-46461-260081761136823/ > /dev/null 2>&1 && sleep 0' 44109 1727204254.10266: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44109 1727204254.10284: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44109 1727204254.10331: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44109 1727204254.10346: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44109 1727204254.10359: stderr chunk (state=3): >>>debug2: checking match 
for 'final all' host 10.31.14.47 originally 10.31.14.47 <<< 44109 1727204254.10390: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44109 1727204254.10459: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 44109 1727204254.10479: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44109 1727204254.10501: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44109 1727204254.10607: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44109 1727204254.12665: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44109 1727204254.12681: stdout chunk (state=3): >>><<< 44109 1727204254.12700: stderr chunk (state=3): >>><<< 44109 1727204254.12883: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master 
version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44109 1727204254.12933: handler run complete 44109 1727204254.12936: attempt loop complete, returning result 44109 1727204254.12938: _execute() done 44109 1727204254.12940: dumping result to json 44109 1727204254.12942: done dumping result, returning 44109 1727204254.12944: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Re-test connectivity [028d2410-947f-ed67-a560-000000000086] 44109 1727204254.12946: sending task result for task 028d2410-947f-ed67-a560-000000000086 ok: [managed-node1] => { "changed": false, "ping": "pong" } 44109 1727204254.13086: no more pending results, returning what we have 44109 1727204254.13090: results queue empty 44109 1727204254.13090: checking for any_errors_fatal 44109 1727204254.13097: done checking for any_errors_fatal 44109 1727204254.13098: checking for max_fail_percentage 44109 1727204254.13100: done checking for max_fail_percentage 44109 1727204254.13100: checking to see if all hosts have failed and the running result is not ok 44109 1727204254.13101: done checking to see if all hosts have failed 44109 1727204254.13108: getting the remaining hosts for this loop 44109 1727204254.13110: done getting the remaining hosts for this loop 44109 1727204254.13117: getting the next task for host managed-node1 44109 1727204254.13124: done getting next task for host managed-node1 44109 1727204254.13126: ^ task is: TASK: meta (role_complete) 44109 1727204254.13128: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44109 1727204254.13139: done sending task result for task 028d2410-947f-ed67-a560-000000000086 44109 1727204254.13143: WORKER PROCESS EXITING 44109 1727204254.13223: getting variables 44109 1727204254.13225: in VariableManager get_vars() 44109 1727204254.13260: Calling all_inventory to load vars for managed-node1 44109 1727204254.13263: Calling groups_inventory to load vars for managed-node1 44109 1727204254.13265: Calling all_plugins_inventory to load vars for managed-node1 44109 1727204254.13278: Calling all_plugins_play to load vars for managed-node1 44109 1727204254.13281: Calling groups_plugins_inventory to load vars for managed-node1 44109 1727204254.13284: Calling groups_plugins_play to load vars for managed-node1 44109 1727204254.15069: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44109 1727204254.16695: done with get_vars() 44109 1727204254.16726: done getting variables 44109 1727204254.16808: done queuing things up, now waiting for results queue to drain 44109 1727204254.16810: results queue empty 44109 1727204254.16811: checking for any_errors_fatal 44109 1727204254.16814: done checking for any_errors_fatal 44109 1727204254.16815: checking for max_fail_percentage 44109 1727204254.16816: done checking for max_fail_percentage 44109 1727204254.16816: checking to see if all hosts have failed and the running result is not ok 44109 1727204254.16817: done checking to see if all hosts have failed 44109 1727204254.16818: getting the remaining hosts for this loop 44109 1727204254.16819: done getting the remaining hosts for this loop 44109 1727204254.16822: getting the next task for host managed-node1 44109 1727204254.16825: done getting next task for host managed-node1 44109 1727204254.16827: ^ task is: TASK: meta (flush_handlers) 44109 1727204254.16828: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, 
update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44109 1727204254.16831: getting variables 44109 1727204254.16832: in VariableManager get_vars() 44109 1727204254.16844: Calling all_inventory to load vars for managed-node1 44109 1727204254.16847: Calling groups_inventory to load vars for managed-node1 44109 1727204254.16849: Calling all_plugins_inventory to load vars for managed-node1 44109 1727204254.16854: Calling all_plugins_play to load vars for managed-node1 44109 1727204254.16856: Calling groups_plugins_inventory to load vars for managed-node1 44109 1727204254.16858: Calling groups_plugins_play to load vars for managed-node1 44109 1727204254.18012: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44109 1727204254.19935: done with get_vars() 44109 1727204254.19965: done getting variables 44109 1727204254.20114: in VariableManager get_vars() 44109 1727204254.20129: Calling all_inventory to load vars for managed-node1 44109 1727204254.20132: Calling groups_inventory to load vars for managed-node1 44109 1727204254.20134: Calling all_plugins_inventory to load vars for managed-node1 44109 1727204254.20140: Calling all_plugins_play to load vars for managed-node1 44109 1727204254.20143: Calling groups_plugins_inventory to load vars for managed-node1 44109 1727204254.20145: Calling groups_plugins_play to load vars for managed-node1 44109 1727204254.21803: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44109 1727204254.24103: done with get_vars() 44109 1727204254.24140: done queuing things up, now waiting for results queue to drain 44109 1727204254.24143: results queue empty 44109 1727204254.24143: checking for any_errors_fatal 44109 1727204254.24145: done checking for any_errors_fatal 44109 1727204254.24146: checking for 
max_fail_percentage 44109 1727204254.24147: done checking for max_fail_percentage 44109 1727204254.24147: checking to see if all hosts have failed and the running result is not ok 44109 1727204254.24148: done checking to see if all hosts have failed 44109 1727204254.24149: getting the remaining hosts for this loop 44109 1727204254.24150: done getting the remaining hosts for this loop 44109 1727204254.24153: getting the next task for host managed-node1 44109 1727204254.24157: done getting next task for host managed-node1 44109 1727204254.24158: ^ task is: TASK: meta (flush_handlers) 44109 1727204254.24160: ^ state is: HOST STATE: block=5, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44109 1727204254.24163: getting variables 44109 1727204254.24164: in VariableManager get_vars() 44109 1727204254.24179: Calling all_inventory to load vars for managed-node1 44109 1727204254.24181: Calling groups_inventory to load vars for managed-node1 44109 1727204254.24183: Calling all_plugins_inventory to load vars for managed-node1 44109 1727204254.24190: Calling all_plugins_play to load vars for managed-node1 44109 1727204254.24192: Calling groups_plugins_inventory to load vars for managed-node1 44109 1727204254.24195: Calling groups_plugins_play to load vars for managed-node1 44109 1727204254.25370: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44109 1727204254.27034: done with get_vars() 44109 1727204254.27057: done getting variables 44109 1727204254.27112: in VariableManager get_vars() 44109 1727204254.27125: Calling all_inventory to load vars for managed-node1 44109 1727204254.27127: Calling groups_inventory to load vars for managed-node1 44109 1727204254.27129: Calling all_plugins_inventory to load vars 
for managed-node1 44109 1727204254.27134: Calling all_plugins_play to load vars for managed-node1 44109 1727204254.27137: Calling groups_plugins_inventory to load vars for managed-node1 44109 1727204254.27139: Calling groups_plugins_play to load vars for managed-node1 44109 1727204254.28261: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44109 1727204254.30869: done with get_vars() 44109 1727204254.31110: done queuing things up, now waiting for results queue to drain 44109 1727204254.31112: results queue empty 44109 1727204254.31113: checking for any_errors_fatal 44109 1727204254.31114: done checking for any_errors_fatal 44109 1727204254.31115: checking for max_fail_percentage 44109 1727204254.31116: done checking for max_fail_percentage 44109 1727204254.31117: checking to see if all hosts have failed and the running result is not ok 44109 1727204254.31118: done checking to see if all hosts have failed 44109 1727204254.31118: getting the remaining hosts for this loop 44109 1727204254.31119: done getting the remaining hosts for this loop 44109 1727204254.31122: getting the next task for host managed-node1 44109 1727204254.31125: done getting next task for host managed-node1 44109 1727204254.31126: ^ task is: None 44109 1727204254.31128: ^ state is: HOST STATE: block=6, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44109 1727204254.31129: done queuing things up, now waiting for results queue to drain 44109 1727204254.31130: results queue empty 44109 1727204254.31130: checking for any_errors_fatal 44109 1727204254.31131: done checking for any_errors_fatal 44109 1727204254.31132: checking for max_fail_percentage 44109 1727204254.31133: done checking for max_fail_percentage 44109 1727204254.31133: checking to see if all hosts have failed and the running result is not ok 44109 1727204254.31134: done checking to see if all hosts have failed 44109 1727204254.31135: getting the next task for host managed-node1 44109 1727204254.31137: done getting next task for host managed-node1 44109 1727204254.31138: ^ task is: None 44109 1727204254.31139: ^ state is: HOST STATE: block=6, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44109 1727204254.31187: in VariableManager get_vars() 44109 1727204254.31204: done with get_vars() 44109 1727204254.31209: in VariableManager get_vars() 44109 1727204254.31219: done with get_vars() 44109 1727204254.31223: variable 'omit' from source: magic vars 44109 1727204254.31253: in VariableManager get_vars() 44109 1727204254.31263: done with get_vars() 44109 1727204254.31490: variable 'omit' from source: magic vars PLAY [Delete the interface] **************************************************** 44109 1727204254.31879: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py (found_in_cache=True, class_only=False) 44109 1727204254.31934: getting the remaining hosts for this loop 44109 1727204254.31936: done getting the remaining hosts for this loop 44109 1727204254.31938: getting the next task for host managed-node1 44109 1727204254.31941: done getting next task for host managed-node1 44109 1727204254.31944: ^ task is: TASK: Gathering Facts 44109 1727204254.31945: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44109 1727204254.31947: getting variables 44109 1727204254.31948: in VariableManager get_vars() 44109 1727204254.31958: Calling all_inventory to load vars for managed-node1 44109 1727204254.31960: Calling groups_inventory to load vars for managed-node1 44109 1727204254.31962: Calling all_plugins_inventory to load vars for managed-node1 44109 1727204254.31967: Calling all_plugins_play to load vars for managed-node1 44109 1727204254.31969: Calling groups_plugins_inventory to load vars for managed-node1 44109 1727204254.31972: Calling groups_plugins_play to load vars for managed-node1 44109 1727204254.34506: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44109 1727204254.38173: done with get_vars() 44109 1727204254.38204: done getting variables 44109 1727204254.38251: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Gathering Facts] ********************************************************* task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/down_profile+delete_interface.yml:5 Tuesday 24 September 2024 14:57:34 -0400 (0:00:00.651) 0:00:31.179 ***** 44109 1727204254.38279: entering _queue_task() for managed-node1/gather_facts 44109 1727204254.38612: worker is 1 (out of 1 available) 44109 1727204254.38623: exiting _queue_task() for managed-node1/gather_facts 44109 1727204254.38634: done queuing things up, now waiting for results queue to drain 44109 1727204254.38635: waiting for pending results... 
44109 1727204254.38911: running TaskExecutor() for managed-node1/TASK: Gathering Facts 44109 1727204254.39082: in run() - task 028d2410-947f-ed67-a560-00000000057e 44109 1727204254.39086: variable 'ansible_search_path' from source: unknown 44109 1727204254.39089: calling self._execute() 44109 1727204254.39180: variable 'ansible_host' from source: host vars for 'managed-node1' 44109 1727204254.39193: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44109 1727204254.39212: variable 'omit' from source: magic vars 44109 1727204254.39607: variable 'ansible_distribution_major_version' from source: facts 44109 1727204254.39624: Evaluated conditional (ansible_distribution_major_version != '6'): True 44109 1727204254.39638: variable 'omit' from source: magic vars 44109 1727204254.39675: variable 'omit' from source: magic vars 44109 1727204254.39754: variable 'omit' from source: magic vars 44109 1727204254.39768: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 44109 1727204254.39809: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 44109 1727204254.39836: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 44109 1727204254.39863: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44109 1727204254.39982: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44109 1727204254.39985: variable 'inventory_hostname' from source: host vars for 'managed-node1' 44109 1727204254.39989: variable 'ansible_host' from source: host vars for 'managed-node1' 44109 1727204254.39992: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44109 1727204254.40037: Set connection var ansible_connection to ssh 44109 1727204254.40047: Set 
connection var ansible_timeout to 10 44109 1727204254.40057: Set connection var ansible_module_compression to ZIP_DEFLATED 44109 1727204254.40068: Set connection var ansible_pipelining to False 44109 1727204254.40079: Set connection var ansible_shell_executable to /bin/sh 44109 1727204254.40089: Set connection var ansible_shell_type to sh 44109 1727204254.40119: variable 'ansible_shell_executable' from source: unknown 44109 1727204254.40127: variable 'ansible_connection' from source: unknown 44109 1727204254.40134: variable 'ansible_module_compression' from source: unknown 44109 1727204254.40141: variable 'ansible_shell_type' from source: unknown 44109 1727204254.40148: variable 'ansible_shell_executable' from source: unknown 44109 1727204254.40155: variable 'ansible_host' from source: host vars for 'managed-node1' 44109 1727204254.40162: variable 'ansible_pipelining' from source: unknown 44109 1727204254.40169: variable 'ansible_timeout' from source: unknown 44109 1727204254.40180: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44109 1727204254.40433: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 44109 1727204254.40436: variable 'omit' from source: magic vars 44109 1727204254.40439: starting attempt loop 44109 1727204254.40441: running the handler 44109 1727204254.40443: variable 'ansible_facts' from source: unknown 44109 1727204254.40444: _low_level_execute_command(): starting 44109 1727204254.40446: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 44109 1727204254.41160: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44109 1727204254.41189: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config 
debug1: Reading configuration data /etc/ssh/ssh_config <<< 44109 1727204254.41296: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK <<< 44109 1727204254.41702: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44109 1727204254.41799: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44109 1727204254.43662: stdout chunk (state=3): >>>/root <<< 44109 1727204254.43732: stdout chunk (state=3): >>><<< 44109 1727204254.43742: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44109 1727204254.43757: stderr chunk (state=3): >>><<< 44109 1727204254.43787: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final 
Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44109 1727204254.44084: _low_level_execute_command(): starting 44109 1727204254.44088: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204254.4398458-46489-246489985520379 `" && echo ansible-tmp-1727204254.4398458-46489-246489985520379="` echo /root/.ansible/tmp/ansible-tmp-1727204254.4398458-46489-246489985520379 `" ) && sleep 0' 44109 1727204254.45236: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44109 1727204254.45253: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf 
debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found <<< 44109 1727204254.45272: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44109 1727204254.45515: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 44109 1727204254.45529: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 44109 1727204254.45633: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44109 1727204254.47752: stdout chunk (state=3): >>>ansible-tmp-1727204254.4398458-46489-246489985520379=/root/.ansible/tmp/ansible-tmp-1727204254.4398458-46489-246489985520379 <<< 44109 1727204254.48014: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44109 1727204254.48051: stderr chunk (state=3): >>><<< 44109 1727204254.48087: stdout chunk (state=3): >>><<< 44109 1727204254.48117: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204254.4398458-46489-246489985520379=/root/.ansible/tmp/ansible-tmp-1727204254.4398458-46489-246489985520379 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 
originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44109 1727204254.48381: variable 'ansible_module_compression' from source: unknown 44109 1727204254.48384: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-44109pzfqangk/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 44109 1727204254.48446: variable 'ansible_facts' from source: unknown 44109 1727204254.48857: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204254.4398458-46489-246489985520379/AnsiballZ_setup.py 44109 1727204254.49111: Sending initial data 44109 1727204254.49114: Sent initial data (154 bytes) 44109 1727204254.49768: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44109 1727204254.49786: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44109 1727204254.49855: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 44109 1727204254.49908: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 44109 1727204254.49929: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44109 1727204254.50106: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44109 1727204254.50202: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44109 1727204254.51957: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 44109 1727204254.52041: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 44109 1727204254.52221: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-44109pzfqangk/tmpt4qlb740 /root/.ansible/tmp/ansible-tmp-1727204254.4398458-46489-246489985520379/AnsiballZ_setup.py <<< 44109 1727204254.52225: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204254.4398458-46489-246489985520379/AnsiballZ_setup.py" <<< 44109 1727204254.52282: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-44109pzfqangk/tmpt4qlb740" to remote "/root/.ansible/tmp/ansible-tmp-1727204254.4398458-46489-246489985520379/AnsiballZ_setup.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204254.4398458-46489-246489985520379/AnsiballZ_setup.py" <<< 44109 1727204254.55163: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44109 1727204254.55382: stderr chunk (state=3): >>><<< 44109 1727204254.55386: stdout chunk (state=3): >>><<< 44109 1727204254.55388: done transferring module to remote 44109 1727204254.55390: _low_level_execute_command(): starting 44109 1727204254.55392: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204254.4398458-46489-246489985520379/ /root/.ansible/tmp/ansible-tmp-1727204254.4398458-46489-246489985520379/AnsiballZ_setup.py && sleep 0' 44109 1727204254.56661: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44109 1727204254.56699: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44109 1727204254.56727: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44109 1727204254.56744: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44109 1727204254.56759: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 <<< 44109 
1727204254.56773: stderr chunk (state=3): >>>debug2: match not found <<< 44109 1727204254.56817: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44109 1727204254.56837: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 44109 1727204254.56849: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.14.47 is address <<< 44109 1727204254.56936: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK <<< 44109 1727204254.56965: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44109 1727204254.57081: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44109 1727204254.59087: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44109 1727204254.59100: stdout chunk (state=3): >>><<< 44109 1727204254.59151: stderr chunk (state=3): >>><<< 44109 1727204254.59174: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is 
address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44109 1727204254.59250: _low_level_execute_command(): starting 44109 1727204254.59267: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204254.4398458-46489-246489985520379/AnsiballZ_setup.py && sleep 0' 44109 1727204254.60534: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44109 1727204254.60594: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44109 1727204254.60801: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 
44109 1727204254.60837: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK <<< 44109 1727204254.60859: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44109 1727204254.61045: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44109 1727204255.29609: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCWk3MCbpUJBEaXgG200pw3DBo34ukitT+wfYcFTuNXyUUPaXQe7v940X0cf5U78BgS3AAiRxfHirMb+4r43rwxBe5tl4Vq2WM+kz3JnOtxK8ZXTmwS9PbltzX5fg5CVds9Bu6KIwABJMlgT9CTHVjFlTBc8wpoeJvx8wVgMwQlnF+PFN/lzD0gn0sF11bqe2QCvxmm9r7Lum/QdIVGgOiZMMbULJZb0Iih24Tn74Ho6n9zLSFZ5FiFifjm7M6k1hVtfcAQi3GfPdUxkN0Z66f3KaW4hptFlHxttjLliuAyfNF4UrXIab7y/nDix1ll4x4lLHsVRpcwtmVVe/Z+2/pcmRv4fjw8YzWY1pLV5u1BUUBthICgfv0pXEuLP9UD/krnjy0Ho9gAbtjdoRWtn7gvjRds+WEwk83rZS3UnAc3pl2DmHzbp4IfRC1zp8eJPJoVwcSTEr61su59tkNntjdKAcLWeas1p9XBaTIRx7aqRp7Vdet96xbSnDnBCriXgAU=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBJzECuW8BnvEbYnQxnxmebvg9TYk9r0OUd9aUg8FFv4MvjSzW8tCfnW556hw9n4PI2hShtAWz7XExrMZPlTQXRo=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAINUE+mKAHEgVl/vTdVMwRCu3lDCTOYBl1RcikvxylCeg", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_is_chroot": false, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_lsb": {}, "ansible_system": "Linux", "ansible_kernel": "6.11.0-25.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Mon Sep 16 20:35:26 UTC 2024", "ansible_machine": "x86_64", 
"ansible_python_version": "3.12.5", "ansible_fqdn": "managed-node1", "ansible_hostname": "managed-node1", "ansible_nodename": "managed-node1", "ansible_domain": "", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec277914f6c5b9c03bd977e30033112b", "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:11e86335-d786-4518-8abc-c9417b351256", "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_local": {}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-25.el10.x86_64", "root": "UUID=973ca870-ed1b-4e56-a8b4-735608119a28", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-25.el10.x86_64", "root": "UUID=973ca870-ed1b-4e56-a8b4-735608119a28", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_env": {"SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.45.138 50362 10.31.14.47 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.45.138 50362 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", 
"DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_fips": false, "ansible_apparmor": {"status": "disabled"}, "ansible_python": {"version": {"major": 3, "minor": 12, "m<<< 44109 1727204255.29622: stdout chunk (state=3): >>>icro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Tuesday", "weekday_number": "2", "weeknumber": "39", "day": "24", "hour": "14", "minute": "57", "second": "34", "epoch": "1727204254", "epoch_int": "1727204254", "date": "2024-09-24", "time": "14:57:34", "iso8601_micro": "2024-09-24T18:57:34.907156Z", "iso8601": "2024-09-24T18:57:34Z", "iso8601_basic": "20240924T145734907156", "iso8601_basic_short": "20240924T145734", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_pkg_mgr": "dnf", "ansible_fibre_channel_wwn": [], "ansible_loadavg": {"1m": 0.6171875, "5m": 0.5419921875, "15m": 0.31689453125}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_interfaces": ["ethtest0", "lo", "eth0", "peerethtest0"], "ansible_eth0": {"device": "eth0", "macaddress": "0a:ff:dd:89:9b:e5", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.14.47", "broadcast": "10.31.15.255", 
"netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22"}, "ipv6": [{"address": "fe80::8ff:ddff:fe89:9be5", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off 
[fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "<<< 44109 1727204255.29641: stdout chunk (state=3): >>>::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", 
"tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_peerethtest0": {"device": "peerethtest0", "macaddress": "f6:70:7d:ce:1e:9f", "mtu": 1500, "active": true, "type": "ether", "speed": 10000, "promisc": false, "ipv6": [{"address": "fe80::f470:7dff:fece:1e9f", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "on", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "off", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "on", "tx_vlan_offload": "on", 
"ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "on [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "on", "tx_gre_csum_segmentation": "on", "tx_ipxip4_segmentation": "on", "tx_ipxip6_segmentation": "on", "tx_udp_tnl_segmentation": "on", "tx_udp_tnl_csum_segmentation": "on", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "on", "rx_vlan_stag_hw_parse": "on", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_ethtest0": {"device": "ethtest0", "macaddress": "96:5a:e0:79:e4:16", "mtu": 1500, "active": true, "type": "ether", "speed": 10000, "promisc": false, "ipv6": [{"address": "fe80::945a:e0ff:fe79:e416", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on", "scatter_gather": 
"on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "on", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "off", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "on", "tx_vlan_offload": "on", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "on [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "on", "tx_gre_csum_segmentation": "on", "tx_ipxip4_segmentation": "on", "tx_ipxip6_segmentation": "on", "tx_udp_tnl_segmentation": "on", "tx_udp_tnl_csum_segmentation": "on", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "on", "rx_vlan_stag_hw_parse": "on", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.12.1", "interface": "eth0", "address": 
"10.31.14.47", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22", "macaddress": "0a:ff:dd:89:9b:e5", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.14.47"], "ansible_all_ipv6_addresses": ["fe80::8ff:ddff:fe89:9be5", "fe80::f470:7dff:fece:1e9f", "fe80::945a:e0ff:fe79:e416"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.14.47", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::8ff:ddff:fe89:9be5", "fe80::945a:e0ff:fe79:e416", "fe80::f470:7dff:fece:1e9f"]}, "ansible_iscsi_iqn": "", "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 29<<< 44109 1727204255.29659: stdout chunk (state=3): >>>18, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 613, "free": 2918}, "nocache": {"free": 3277, "used": 254}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec277914-f6c5-b9c0-3bd9-77e30033112b", "ansible_product_uuid": "ec277914-f6c5-b9c0-3bd9-77e30033112b", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], 
"labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["973ca870-ed1b-4e56-a8b4-735608119a28"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "973ca870-ed1b-4e56-a8b4-735608119a28", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["973ca870-ed1b-4e56-a8b4-735608119a28"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 846, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261781118976, "block_size": 4096, "block_total": 65519099, "block_available": 63911406, "block_used": 1607693, "inode_total": 131070960, "inode_available": 131027257, "inode_used": 43703, "uuid": "973ca870-ed1b-4e56-a8b4-735608119a28"}], "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 44109 1727204255.32029: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.47 closed. 
<<< 44109 1727204255.32061: stderr chunk (state=3): >>><<< 44109 1727204255.32065: stdout chunk (state=3): >>><<< 44109 1727204255.32105: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCWk3MCbpUJBEaXgG200pw3DBo34ukitT+wfYcFTuNXyUUPaXQe7v940X0cf5U78BgS3AAiRxfHirMb+4r43rwxBe5tl4Vq2WM+kz3JnOtxK8ZXTmwS9PbltzX5fg5CVds9Bu6KIwABJMlgT9CTHVjFlTBc8wpoeJvx8wVgMwQlnF+PFN/lzD0gn0sF11bqe2QCvxmm9r7Lum/QdIVGgOiZMMbULJZb0Iih24Tn74Ho6n9zLSFZ5FiFifjm7M6k1hVtfcAQi3GfPdUxkN0Z66f3KaW4hptFlHxttjLliuAyfNF4UrXIab7y/nDix1ll4x4lLHsVRpcwtmVVe/Z+2/pcmRv4fjw8YzWY1pLV5u1BUUBthICgfv0pXEuLP9UD/krnjy0Ho9gAbtjdoRWtn7gvjRds+WEwk83rZS3UnAc3pl2DmHzbp4IfRC1zp8eJPJoVwcSTEr61su59tkNntjdKAcLWeas1p9XBaTIRx7aqRp7Vdet96xbSnDnBCriXgAU=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBJzECuW8BnvEbYnQxnxmebvg9TYk9r0OUd9aUg8FFv4MvjSzW8tCfnW556hw9n4PI2hShtAWz7XExrMZPlTQXRo=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAINUE+mKAHEgVl/vTdVMwRCu3lDCTOYBl1RcikvxylCeg", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_is_chroot": false, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_lsb": {}, "ansible_system": "Linux", "ansible_kernel": "6.11.0-25.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Mon Sep 16 20:35:26 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "managed-node1", "ansible_hostname": "managed-node1", "ansible_nodename": "managed-node1", "ansible_domain": "", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", 
"ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec277914f6c5b9c03bd977e30033112b", "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:11e86335-d786-4518-8abc-c9417b351256", "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_local": {}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-25.el10.x86_64", "root": "UUID=973ca870-ed1b-4e56-a8b4-735608119a28", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-25.el10.x86_64", "root": "UUID=973ca870-ed1b-4e56-a8b4-735608119a28", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_env": {"SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.45.138 50362 10.31.14.47 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.45.138 50362 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", 
"ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_fips": false, "ansible_apparmor": {"status": "disabled"}, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Tuesday", "weekday_number": "2", "weeknumber": "39", "day": "24", "hour": "14", "minute": "57", "second": "34", "epoch": "1727204254", "epoch_int": "1727204254", "date": "2024-09-24", "time": "14:57:34", "iso8601_micro": "2024-09-24T18:57:34.907156Z", "iso8601": "2024-09-24T18:57:34Z", "iso8601_basic": "20240924T145734907156", "iso8601_basic_short": "20240924T145734", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_pkg_mgr": "dnf", "ansible_fibre_channel_wwn": [], "ansible_loadavg": {"1m": 0.6171875, "5m": 0.5419921875, "15m": 0.31689453125}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_interfaces": ["ethtest0", "lo", "eth0", "peerethtest0"], "ansible_eth0": {"device": "eth0", "macaddress": "0a:ff:dd:89:9b:e5", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.14.47", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22"}, "ipv6": [{"address": "fe80::8ff:ddff:fe89:9be5", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", 
"tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", 
"rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off 
[fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_peerethtest0": {"device": "peerethtest0", "macaddress": "f6:70:7d:ce:1e:9f", "mtu": 1500, "active": true, "type": "ether", "speed": 10000, "promisc": false, "ipv6": [{"address": "fe80::f470:7dff:fece:1e9f", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "on", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "off", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "on", "tx_vlan_offload": "on", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "on [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", 
"tx_gre_segmentation": "on", "tx_gre_csum_segmentation": "on", "tx_ipxip4_segmentation": "on", "tx_ipxip6_segmentation": "on", "tx_udp_tnl_segmentation": "on", "tx_udp_tnl_csum_segmentation": "on", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "on", "rx_vlan_stag_hw_parse": "on", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_ethtest0": {"device": "ethtest0", "macaddress": "96:5a:e0:79:e4:16", "mtu": 1500, "active": true, "type": "ether", "speed": 10000, "promisc": false, "ipv6": [{"address": "fe80::945a:e0ff:fe79:e416", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "on", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", 
"generic_receive_offload": "off", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "on", "tx_vlan_offload": "on", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "on [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "on", "tx_gre_csum_segmentation": "on", "tx_ipxip4_segmentation": "on", "tx_ipxip6_segmentation": "on", "tx_udp_tnl_segmentation": "on", "tx_udp_tnl_csum_segmentation": "on", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "on", "rx_vlan_stag_hw_parse": "on", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.12.1", "interface": "eth0", "address": "10.31.14.47", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22", "macaddress": "0a:ff:dd:89:9b:e5", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.14.47"], 
"ansible_all_ipv6_addresses": ["fe80::8ff:ddff:fe89:9be5", "fe80::f470:7dff:fece:1e9f", "fe80::945a:e0ff:fe79:e416"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.14.47", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::8ff:ddff:fe89:9be5", "fe80::945a:e0ff:fe79:e416", "fe80::f470:7dff:fece:1e9f"]}, "ansible_iscsi_iqn": "", "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2918, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 613, "free": 2918}, "nocache": {"free": 3277, "used": 254}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec277914-f6c5-b9c0-3bd9-77e30033112b", "ansible_product_uuid": "ec277914-f6c5-b9c0-3bd9-77e30033112b", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["973ca870-ed1b-4e56-a8b4-735608119a28"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", 
"sectorsize": 512, "size": "250.00 GB", "uuid": "973ca870-ed1b-4e56-a8b4-735608119a28", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["973ca870-ed1b-4e56-a8b4-735608119a28"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 846, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261781118976, "block_size": 4096, "block_total": 65519099, "block_available": 63911406, "block_used": 1607693, "inode_total": 131070960, "inode_available": 131027257, "inode_used": 43703, "uuid": "973ca870-ed1b-4e56-a8b4-735608119a28"}], "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.47 closed. 44109 1727204255.32403: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204254.4398458-46489-246489985520379/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 44109 1727204255.32424: _low_level_execute_command(): starting 44109 1727204255.32427: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204254.4398458-46489-246489985520379/ > /dev/null 2>&1 && sleep 0' 44109 1727204255.32887: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 44109 1727204255.32890: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found <<< 44109 
1727204255.32894: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44109 1727204255.32896: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44109 1727204255.32900: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44109 1727204255.32949: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 44109 1727204255.32952: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 44109 1727204255.33043: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44109 1727204255.34994: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44109 1727204255.35019: stderr chunk (state=3): >>><<< 44109 1727204255.35023: stdout chunk (state=3): >>><<< 44109 1727204255.35036: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44109 1727204255.35044: handler run complete 44109 1727204255.35141: variable 'ansible_facts' from source: unknown 44109 1727204255.35217: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44109 1727204255.35424: variable 'ansible_facts' from source: unknown 44109 1727204255.35483: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44109 1727204255.35585: attempt loop complete, returning result 44109 1727204255.35588: _execute() done 44109 1727204255.35591: dumping result to json 44109 1727204255.35617: done dumping result, returning 44109 1727204255.35626: done running TaskExecutor() for managed-node1/TASK: Gathering Facts [028d2410-947f-ed67-a560-00000000057e] 44109 1727204255.35629: sending task result for task 028d2410-947f-ed67-a560-00000000057e 44109 1727204255.36385: done sending task result for task 028d2410-947f-ed67-a560-00000000057e 44109 1727204255.36388: WORKER PROCESS EXITING ok: [managed-node1] 44109 1727204255.36716: no more pending results, returning what we have 44109 1727204255.36719: results queue empty 44109 1727204255.36720: checking for any_errors_fatal 44109 1727204255.36721: done checking for any_errors_fatal 44109 1727204255.36722: checking for max_fail_percentage 44109 1727204255.36724: done checking for max_fail_percentage 44109 1727204255.36724: 
checking to see if all hosts have failed and the running result is not ok 44109 1727204255.36725: done checking to see if all hosts have failed 44109 1727204255.36726: getting the remaining hosts for this loop 44109 1727204255.36727: done getting the remaining hosts for this loop 44109 1727204255.36731: getting the next task for host managed-node1 44109 1727204255.36736: done getting next task for host managed-node1 44109 1727204255.36738: ^ task is: TASK: meta (flush_handlers) 44109 1727204255.36740: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44109 1727204255.36744: getting variables 44109 1727204255.36745: in VariableManager get_vars() 44109 1727204255.36768: Calling all_inventory to load vars for managed-node1 44109 1727204255.36770: Calling groups_inventory to load vars for managed-node1 44109 1727204255.36774: Calling all_plugins_inventory to load vars for managed-node1 44109 1727204255.36785: Calling all_plugins_play to load vars for managed-node1 44109 1727204255.36788: Calling groups_plugins_inventory to load vars for managed-node1 44109 1727204255.36791: Calling groups_plugins_play to load vars for managed-node1 44109 1727204255.37669: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44109 1727204255.38639: done with get_vars() 44109 1727204255.38655: done getting variables 44109 1727204255.38707: in VariableManager get_vars() 44109 1727204255.38714: Calling all_inventory to load vars for managed-node1 44109 1727204255.38716: Calling groups_inventory to load vars for managed-node1 44109 1727204255.38717: Calling all_plugins_inventory to load vars for managed-node1 44109 1727204255.38721: Calling all_plugins_play to load vars for managed-node1 44109 
1727204255.38722: Calling groups_plugins_inventory to load vars for managed-node1 44109 1727204255.38724: Calling groups_plugins_play to load vars for managed-node1 44109 1727204255.39649: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44109 1727204255.41137: done with get_vars() 44109 1727204255.41169: done queuing things up, now waiting for results queue to drain 44109 1727204255.41171: results queue empty 44109 1727204255.41172: checking for any_errors_fatal 44109 1727204255.41178: done checking for any_errors_fatal 44109 1727204255.41179: checking for max_fail_percentage 44109 1727204255.41180: done checking for max_fail_percentage 44109 1727204255.41186: checking to see if all hosts have failed and the running result is not ok 44109 1727204255.41187: done checking to see if all hosts have failed 44109 1727204255.41188: getting the remaining hosts for this loop 44109 1727204255.41189: done getting the remaining hosts for this loop 44109 1727204255.41192: getting the next task for host managed-node1 44109 1727204255.41196: done getting next task for host managed-node1 44109 1727204255.41199: ^ task is: TASK: Include the task 'delete_interface.yml' 44109 1727204255.41200: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44109 1727204255.41203: getting variables 44109 1727204255.41204: in VariableManager get_vars() 44109 1727204255.41213: Calling all_inventory to load vars for managed-node1 44109 1727204255.41216: Calling groups_inventory to load vars for managed-node1 44109 1727204255.41218: Calling all_plugins_inventory to load vars for managed-node1 44109 1727204255.41223: Calling all_plugins_play to load vars for managed-node1 44109 1727204255.41225: Calling groups_plugins_inventory to load vars for managed-node1 44109 1727204255.41228: Calling groups_plugins_play to load vars for managed-node1 44109 1727204255.46541: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44109 1727204255.48168: done with get_vars() 44109 1727204255.48195: done getting variables TASK [Include the task 'delete_interface.yml'] ********************************* task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/down_profile+delete_interface.yml:8 Tuesday 24 September 2024 14:57:35 -0400 (0:00:01.099) 0:00:32.279 ***** 44109 1727204255.48272: entering _queue_task() for managed-node1/include_tasks 44109 1727204255.48683: worker is 1 (out of 1 available) 44109 1727204255.48696: exiting _queue_task() for managed-node1/include_tasks 44109 1727204255.48709: done queuing things up, now waiting for results queue to drain 44109 1727204255.48711: waiting for pending results... 
44109 1727204255.48993: running TaskExecutor() for managed-node1/TASK: Include the task 'delete_interface.yml' 44109 1727204255.49131: in run() - task 028d2410-947f-ed67-a560-000000000089 44109 1727204255.49154: variable 'ansible_search_path' from source: unknown 44109 1727204255.49200: calling self._execute() 44109 1727204255.49303: variable 'ansible_host' from source: host vars for 'managed-node1' 44109 1727204255.49317: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44109 1727204255.49334: variable 'omit' from source: magic vars 44109 1727204255.49822: variable 'ansible_distribution_major_version' from source: facts 44109 1727204255.49938: Evaluated conditional (ansible_distribution_major_version != '6'): True 44109 1727204255.49942: _execute() done 44109 1727204255.49944: dumping result to json 44109 1727204255.49947: done dumping result, returning 44109 1727204255.49949: done running TaskExecutor() for managed-node1/TASK: Include the task 'delete_interface.yml' [028d2410-947f-ed67-a560-000000000089] 44109 1727204255.49956: sending task result for task 028d2410-947f-ed67-a560-000000000089 44109 1727204255.50136: done sending task result for task 028d2410-947f-ed67-a560-000000000089 44109 1727204255.50139: WORKER PROCESS EXITING 44109 1727204255.50180: no more pending results, returning what we have 44109 1727204255.50186: in VariableManager get_vars() 44109 1727204255.50221: Calling all_inventory to load vars for managed-node1 44109 1727204255.50223: Calling groups_inventory to load vars for managed-node1 44109 1727204255.50226: Calling all_plugins_inventory to load vars for managed-node1 44109 1727204255.50239: Calling all_plugins_play to load vars for managed-node1 44109 1727204255.50241: Calling groups_plugins_inventory to load vars for managed-node1 44109 1727204255.50243: Calling groups_plugins_play to load vars for managed-node1 44109 1727204255.51822: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44109 1727204255.53425: done with get_vars() 44109 1727204255.53447: variable 'ansible_search_path' from source: unknown 44109 1727204255.53462: we have included files to process 44109 1727204255.53463: generating all_blocks data 44109 1727204255.53464: done generating all_blocks data 44109 1727204255.53465: processing included file: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/delete_interface.yml 44109 1727204255.53466: loading included file: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/delete_interface.yml 44109 1727204255.53468: Loading data from /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/delete_interface.yml 44109 1727204255.53702: done processing included file 44109 1727204255.53704: iterating over new_blocks loaded from include file 44109 1727204255.53706: in VariableManager get_vars() 44109 1727204255.53717: done with get_vars() 44109 1727204255.53720: filtering new block on tags 44109 1727204255.53735: done filtering new block on tags 44109 1727204255.53737: done iterating over new_blocks loaded from include file included: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/delete_interface.yml for managed-node1 44109 1727204255.53742: extending task lists for all hosts with included blocks 44109 1727204255.53772: done extending task lists 44109 1727204255.53773: done processing included files 44109 1727204255.53774: results queue empty 44109 1727204255.53777: checking for any_errors_fatal 44109 1727204255.53778: done checking for any_errors_fatal 44109 1727204255.53779: checking for max_fail_percentage 44109 1727204255.53780: done checking for max_fail_percentage 44109 1727204255.53781: checking to see if all hosts have failed and the running result 
is not ok 44109 1727204255.53782: done checking to see if all hosts have failed 44109 1727204255.53782: getting the remaining hosts for this loop 44109 1727204255.53783: done getting the remaining hosts for this loop 44109 1727204255.53786: getting the next task for host managed-node1 44109 1727204255.53790: done getting next task for host managed-node1 44109 1727204255.53792: ^ task is: TASK: Remove test interface if necessary 44109 1727204255.53794: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44109 1727204255.53796: getting variables 44109 1727204255.53797: in VariableManager get_vars() 44109 1727204255.53805: Calling all_inventory to load vars for managed-node1 44109 1727204255.53808: Calling groups_inventory to load vars for managed-node1 44109 1727204255.53810: Calling all_plugins_inventory to load vars for managed-node1 44109 1727204255.53815: Calling all_plugins_play to load vars for managed-node1 44109 1727204255.53817: Calling groups_plugins_inventory to load vars for managed-node1 44109 1727204255.53820: Calling groups_plugins_play to load vars for managed-node1 44109 1727204255.55013: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44109 1727204255.57622: done with get_vars() 44109 1727204255.57658: done getting variables 44109 1727204255.57719: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Remove test interface if necessary] ************************************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/delete_interface.yml:3 Tuesday 24 September 2024 14:57:35 -0400 (0:00:00.094) 0:00:32.373 ***** 44109 1727204255.57752: entering _queue_task() for managed-node1/command 44109 1727204255.58469: worker is 1 (out of 1 available) 44109 1727204255.58881: exiting _queue_task() for managed-node1/command 44109 1727204255.58893: done queuing things up, now waiting for results queue to drain 44109 1727204255.58893: waiting for pending results... 
44109 1727204255.59305: running TaskExecutor() for managed-node1/TASK: Remove test interface if necessary 44109 1727204255.59310: in run() - task 028d2410-947f-ed67-a560-00000000058f 44109 1727204255.59313: variable 'ansible_search_path' from source: unknown 44109 1727204255.59316: variable 'ansible_search_path' from source: unknown 44109 1727204255.59497: calling self._execute() 44109 1727204255.59685: variable 'ansible_host' from source: host vars for 'managed-node1' 44109 1727204255.59829: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44109 1727204255.59833: variable 'omit' from source: magic vars 44109 1727204255.60660: variable 'ansible_distribution_major_version' from source: facts 44109 1727204255.60713: Evaluated conditional (ansible_distribution_major_version != '6'): True 44109 1727204255.60916: variable 'omit' from source: magic vars 44109 1727204255.60920: variable 'omit' from source: magic vars 44109 1727204255.61127: variable 'interface' from source: set_fact 44109 1727204255.61152: variable 'omit' from source: magic vars 44109 1727204255.61202: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 44109 1727204255.61246: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 44109 1727204255.61271: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 44109 1727204255.61300: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44109 1727204255.61317: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44109 1727204255.61357: variable 'inventory_hostname' from source: host vars for 'managed-node1' 44109 1727204255.61367: variable 'ansible_host' from source: host vars for 'managed-node1' 44109 1727204255.61461: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44109 1727204255.61487: Set connection var ansible_connection to ssh 44109 1727204255.61497: Set connection var ansible_timeout to 10 44109 1727204255.61508: Set connection var ansible_module_compression to ZIP_DEFLATED 44109 1727204255.61519: Set connection var ansible_pipelining to False 44109 1727204255.61528: Set connection var ansible_shell_executable to /bin/sh 44109 1727204255.61536: Set connection var ansible_shell_type to sh 44109 1727204255.61563: variable 'ansible_shell_executable' from source: unknown 44109 1727204255.61576: variable 'ansible_connection' from source: unknown 44109 1727204255.61584: variable 'ansible_module_compression' from source: unknown 44109 1727204255.61590: variable 'ansible_shell_type' from source: unknown 44109 1727204255.61597: variable 'ansible_shell_executable' from source: unknown 44109 1727204255.61603: variable 'ansible_host' from source: host vars for 'managed-node1' 44109 1727204255.61610: variable 'ansible_pipelining' from source: unknown 44109 1727204255.61616: variable 'ansible_timeout' from source: unknown 44109 1727204255.61624: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44109 1727204255.61762: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 44109 1727204255.61781: variable 'omit' from source: magic vars 44109 1727204255.61807: starting attempt loop 44109 1727204255.61911: running the handler 44109 1727204255.61914: _low_level_execute_command(): starting 44109 1727204255.61916: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 44109 1727204255.62582: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44109 1727204255.62684: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK <<< 44109 1727204255.62711: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44109 1727204255.62920: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44109 1727204255.64816: stdout chunk (state=3): >>>/root <<< 44109 1727204255.64884: stdout chunk (state=3): >>><<< 44109 1727204255.64893: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44109 1727204255.64991: stderr chunk (state=3): >>><<< 44109 1727204255.65017: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config 
debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44109 1727204255.65037: _low_level_execute_command(): starting 44109 1727204255.65048: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204255.65024-46536-212883178938337 `" && echo ansible-tmp-1727204255.65024-46536-212883178938337="` echo /root/.ansible/tmp/ansible-tmp-1727204255.65024-46536-212883178938337 `" ) && sleep 0' 44109 1727204255.66299: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match 
for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44109 1727204255.66416: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 44109 1727204255.66436: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44109 1727204255.66601: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44109 1727204255.66728: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44109 1727204255.68819: stdout chunk (state=3): >>>ansible-tmp-1727204255.65024-46536-212883178938337=/root/.ansible/tmp/ansible-tmp-1727204255.65024-46536-212883178938337 <<< 44109 1727204255.68997: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44109 1727204255.69032: stderr chunk (state=3): >>><<< 44109 1727204255.69043: stdout chunk (state=3): >>><<< 44109 1727204255.69096: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204255.65024-46536-212883178938337=/root/.ansible/tmp/ansible-tmp-1727204255.65024-46536-212883178938337 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: 
match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44109 1727204255.69138: variable 'ansible_module_compression' from source: unknown 44109 1727204255.69381: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-44109pzfqangk/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 44109 1727204255.69387: variable 'ansible_facts' from source: unknown 44109 1727204255.69489: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204255.65024-46536-212883178938337/AnsiballZ_command.py 44109 1727204255.70085: Sending initial data 44109 1727204255.70096: Sent initial data (154 bytes) 44109 1727204255.71151: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44109 1727204255.71193: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44109 1727204255.71377: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying 
existing master at '/root/.ansible/cp/a0f5415566' <<< 44109 1727204255.71463: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 44109 1727204255.71544: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44109 1727204255.73505: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 debug2: Sending SSH2_FXP_REALPATH "." 
debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204255.65024-46536-212883178938337/AnsiballZ_command.py" <<< 44109 1727204255.73509: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-44109pzfqangk/tmpui7q9eh2 /root/.ansible/tmp/ansible-tmp-1727204255.65024-46536-212883178938337/AnsiballZ_command.py <<< 44109 1727204255.73534: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-44109pzfqangk/tmpui7q9eh2" to remote "/root/.ansible/tmp/ansible-tmp-1727204255.65024-46536-212883178938337/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204255.65024-46536-212883178938337/AnsiballZ_command.py" <<< 44109 1727204255.75073: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44109 1727204255.75079: stdout chunk (state=3): >>><<< 44109 1727204255.75082: stderr chunk (state=3): >>><<< 44109 1727204255.75084: done transferring module to remote 44109 1727204255.75086: _low_level_execute_command(): starting 44109 1727204255.75089: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204255.65024-46536-212883178938337/ /root/.ansible/tmp/ansible-tmp-1727204255.65024-46536-212883178938337/AnsiballZ_command.py && sleep 0' 44109 1727204255.76285: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data 
/root/.ssh/config <<< 44109 1727204255.76382: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK <<< 44109 1727204255.76514: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44109 1727204255.76596: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44109 1727204255.78561: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44109 1727204255.78719: stderr chunk (state=3): >>><<< 44109 1727204255.78723: stdout chunk (state=3): >>><<< 44109 1727204255.78742: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 
setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44109 1727204255.78758: _low_level_execute_command(): starting 44109 1727204255.78942: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204255.65024-46536-212883178938337/AnsiballZ_command.py && sleep 0' 44109 1727204255.79996: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44109 1727204255.80097: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44109 1727204255.80282: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44109 1727204255.80316: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 44109 1727204255.80339: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44109 1727204255.80474: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44109 1727204255.80558: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44109 1727204255.98148: stdout chunk (state=3): >>> 
{"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "del", "ethtest0"], "start": "2024-09-24 14:57:35.967405", "end": "2024-09-24 14:57:35.978708", "delta": "0:00:00.011303", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link del ethtest0", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 44109 1727204256.00966: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.47 closed. <<< 44109 1727204256.01376: stdout chunk (state=3): >>><<< 44109 1727204256.01380: stderr chunk (state=3): >>><<< 44109 1727204256.01382: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "del", "ethtest0"], "start": "2024-09-24 14:57:35.967405", "end": "2024-09-24 14:57:35.978708", "delta": "0:00:00.011303", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link del ethtest0", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.47 closed. 44109 1727204256.01386: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip link del ethtest0', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204255.65024-46536-212883178938337/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 44109 1727204256.01388: _low_level_execute_command(): starting 44109 1727204256.01390: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204255.65024-46536-212883178938337/ > /dev/null 2>&1 && sleep 0' 44109 1727204256.02656: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44109 1727204256.02896: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 44109 1727204256.03002: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 44109 1727204256.03015: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44109 1727204256.04998: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44109 1727204256.05038: stderr chunk (state=3): >>><<< 44109 1727204256.05048: stdout chunk (state=3): >>><<< 44109 1727204256.05072: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44109 1727204256.05481: handler run complete 44109 1727204256.05484: Evaluated conditional (False): False 44109 1727204256.05488: attempt loop complete, returning result 44109 1727204256.05491: _execute() done 44109 1727204256.05493: dumping result to json 44109 1727204256.05495: done dumping result, returning 44109 1727204256.05497: done running TaskExecutor() for managed-node1/TASK: Remove test interface if necessary [028d2410-947f-ed67-a560-00000000058f] 44109 1727204256.05500: sending task result for task 028d2410-947f-ed67-a560-00000000058f 44109 1727204256.05575: done sending task result for task 028d2410-947f-ed67-a560-00000000058f 44109 1727204256.05581: WORKER PROCESS EXITING ok: [managed-node1] => { "changed": false, "cmd": [ "ip", "link", "del", "ethtest0" ], "delta": "0:00:00.011303", "end": "2024-09-24 14:57:35.978708", "rc": 0, "start": "2024-09-24 14:57:35.967405" } 44109 1727204256.05652: no more pending results, returning what we have 44109 1727204256.05657: results queue empty 44109 1727204256.05658: checking for any_errors_fatal 44109 1727204256.05659: done checking for any_errors_fatal 44109 1727204256.05660: checking for max_fail_percentage 44109 1727204256.05662: done checking for max_fail_percentage 44109 1727204256.05663: checking to see if all hosts have failed and the running result is not ok 44109 1727204256.05664: done checking to see if all hosts have failed 44109 1727204256.05665: getting the remaining hosts for this loop 44109 1727204256.05667: done getting the remaining hosts for this loop 44109 1727204256.05670: getting the next task for host 
managed-node1 44109 1727204256.05683: done getting next task for host managed-node1 44109 1727204256.05685: ^ task is: TASK: meta (flush_handlers) 44109 1727204256.05687: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44109 1727204256.05692: getting variables 44109 1727204256.05694: in VariableManager get_vars() 44109 1727204256.05727: Calling all_inventory to load vars for managed-node1 44109 1727204256.05731: Calling groups_inventory to load vars for managed-node1 44109 1727204256.05735: Calling all_plugins_inventory to load vars for managed-node1 44109 1727204256.05747: Calling all_plugins_play to load vars for managed-node1 44109 1727204256.05750: Calling groups_plugins_inventory to load vars for managed-node1 44109 1727204256.05754: Calling groups_plugins_play to load vars for managed-node1 44109 1727204256.08193: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44109 1727204256.09895: done with get_vars() 44109 1727204256.09924: done getting variables 44109 1727204256.09997: in VariableManager get_vars() 44109 1727204256.10007: Calling all_inventory to load vars for managed-node1 44109 1727204256.10009: Calling groups_inventory to load vars for managed-node1 44109 1727204256.10014: Calling all_plugins_inventory to load vars for managed-node1 44109 1727204256.10020: Calling all_plugins_play to load vars for managed-node1 44109 1727204256.10022: Calling groups_plugins_inventory to load vars for managed-node1 44109 1727204256.10025: Calling groups_plugins_play to load vars for managed-node1 44109 1727204256.12302: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44109 1727204256.13609: done with 
get_vars() 44109 1727204256.13645: done queuing things up, now waiting for results queue to drain 44109 1727204256.13647: results queue empty 44109 1727204256.13655: checking for any_errors_fatal 44109 1727204256.13658: done checking for any_errors_fatal 44109 1727204256.13659: checking for max_fail_percentage 44109 1727204256.13660: done checking for max_fail_percentage 44109 1727204256.13660: checking to see if all hosts have failed and the running result is not ok 44109 1727204256.13661: done checking to see if all hosts have failed 44109 1727204256.13661: getting the remaining hosts for this loop 44109 1727204256.13662: done getting the remaining hosts for this loop 44109 1727204256.13664: getting the next task for host managed-node1 44109 1727204256.13667: done getting next task for host managed-node1 44109 1727204256.13668: ^ task is: TASK: meta (flush_handlers) 44109 1727204256.13669: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44109 1727204256.13671: getting variables 44109 1727204256.13671: in VariableManager get_vars() 44109 1727204256.13680: Calling all_inventory to load vars for managed-node1 44109 1727204256.13682: Calling groups_inventory to load vars for managed-node1 44109 1727204256.13684: Calling all_plugins_inventory to load vars for managed-node1 44109 1727204256.13688: Calling all_plugins_play to load vars for managed-node1 44109 1727204256.13690: Calling groups_plugins_inventory to load vars for managed-node1 44109 1727204256.13692: Calling groups_plugins_play to load vars for managed-node1 44109 1727204256.14935: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44109 1727204256.16479: done with get_vars() 44109 1727204256.16495: done getting variables 44109 1727204256.16532: in VariableManager get_vars() 44109 1727204256.16539: Calling all_inventory to load vars for managed-node1 44109 1727204256.16540: Calling groups_inventory to load vars for managed-node1 44109 1727204256.16542: Calling all_plugins_inventory to load vars for managed-node1 44109 1727204256.16545: Calling all_plugins_play to load vars for managed-node1 44109 1727204256.16546: Calling groups_plugins_inventory to load vars for managed-node1 44109 1727204256.16548: Calling groups_plugins_play to load vars for managed-node1 44109 1727204256.17189: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44109 1727204256.18634: done with get_vars() 44109 1727204256.18659: done queuing things up, now waiting for results queue to drain 44109 1727204256.18662: results queue empty 44109 1727204256.18662: checking for any_errors_fatal 44109 1727204256.18664: done checking for any_errors_fatal 44109 1727204256.18664: checking for max_fail_percentage 44109 1727204256.18665: done checking for max_fail_percentage 44109 1727204256.18666: checking to see if all hosts have failed and the running result is not 
ok 44109 1727204256.18667: done checking to see if all hosts have failed 44109 1727204256.18667: getting the remaining hosts for this loop 44109 1727204256.18668: done getting the remaining hosts for this loop 44109 1727204256.18671: getting the next task for host managed-node1 44109 1727204256.18674: done getting next task for host managed-node1 44109 1727204256.18677: ^ task is: None 44109 1727204256.18678: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44109 1727204256.18680: done queuing things up, now waiting for results queue to drain 44109 1727204256.18680: results queue empty 44109 1727204256.18681: checking for any_errors_fatal 44109 1727204256.18682: done checking for any_errors_fatal 44109 1727204256.18682: checking for max_fail_percentage 44109 1727204256.18683: done checking for max_fail_percentage 44109 1727204256.18684: checking to see if all hosts have failed and the running result is not ok 44109 1727204256.18684: done checking to see if all hosts have failed 44109 1727204256.18689: getting the next task for host managed-node1 44109 1727204256.18692: done getting next task for host managed-node1 44109 1727204256.18693: ^ task is: None 44109 1727204256.18694: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44109 1727204256.18749: in VariableManager get_vars() 44109 1727204256.18768: done with get_vars() 44109 1727204256.18773: in VariableManager get_vars() 44109 1727204256.18786: done with get_vars() 44109 1727204256.18789: variable 'omit' from source: magic vars 44109 1727204256.19038: variable 'profile' from source: play vars 44109 1727204256.19142: in VariableManager get_vars() 44109 1727204256.19157: done with get_vars() 44109 1727204256.19381: variable 'omit' from source: magic vars 44109 1727204256.19445: variable 'profile' from source: play vars PLAY [Remove {{ profile }}] **************************************************** 44109 1727204256.20356: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py (found_in_cache=True, class_only=False) 44109 1727204256.20379: getting the remaining hosts for this loop 44109 1727204256.20380: done getting the remaining hosts for this loop 44109 1727204256.20383: getting the next task for host managed-node1 44109 1727204256.20386: done getting next task for host managed-node1 44109 1727204256.20387: ^ task is: TASK: Gathering Facts 44109 1727204256.20388: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44109 1727204256.20390: getting variables 44109 1727204256.20390: in VariableManager get_vars() 44109 1727204256.20399: Calling all_inventory to load vars for managed-node1 44109 1727204256.20400: Calling groups_inventory to load vars for managed-node1 44109 1727204256.20401: Calling all_plugins_inventory to load vars for managed-node1 44109 1727204256.20405: Calling all_plugins_play to load vars for managed-node1 44109 1727204256.20407: Calling groups_plugins_inventory to load vars for managed-node1 44109 1727204256.20408: Calling groups_plugins_play to load vars for managed-node1 44109 1727204256.21190: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44109 1727204256.22683: done with get_vars() 44109 1727204256.22712: done getting variables 44109 1727204256.22759: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Gathering Facts] ********************************************************* task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/remove_profile.yml:3 Tuesday 24 September 2024 14:57:36 -0400 (0:00:00.652) 0:00:33.026 ***** 44109 1727204256.22992: entering _queue_task() for managed-node1/gather_facts 44109 1727204256.23626: worker is 1 (out of 1 available) 44109 1727204256.23658: exiting _queue_task() for managed-node1/gather_facts 44109 1727204256.23671: done queuing things up, now waiting for results queue to drain 44109 1727204256.23672: waiting for pending results... 
44109 1727204256.23918: running TaskExecutor() for managed-node1/TASK: Gathering Facts 44109 1727204256.23999: in run() - task 028d2410-947f-ed67-a560-00000000059d 44109 1727204256.24013: variable 'ansible_search_path' from source: unknown 44109 1727204256.24046: calling self._execute() 44109 1727204256.24126: variable 'ansible_host' from source: host vars for 'managed-node1' 44109 1727204256.24129: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44109 1727204256.24137: variable 'omit' from source: magic vars 44109 1727204256.24426: variable 'ansible_distribution_major_version' from source: facts 44109 1727204256.24436: Evaluated conditional (ansible_distribution_major_version != '6'): True 44109 1727204256.24439: variable 'omit' from source: magic vars 44109 1727204256.24463: variable 'omit' from source: magic vars 44109 1727204256.24487: variable 'omit' from source: magic vars 44109 1727204256.24523: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 44109 1727204256.24550: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 44109 1727204256.24566: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 44109 1727204256.24582: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44109 1727204256.24592: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44109 1727204256.24620: variable 'inventory_hostname' from source: host vars for 'managed-node1' 44109 1727204256.24623: variable 'ansible_host' from source: host vars for 'managed-node1' 44109 1727204256.24626: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44109 1727204256.24697: Set connection var ansible_connection to ssh 44109 1727204256.24700: Set 
connection var ansible_timeout to 10 44109 1727204256.24707: Set connection var ansible_module_compression to ZIP_DEFLATED 44109 1727204256.24714: Set connection var ansible_pipelining to False 44109 1727204256.24722: Set connection var ansible_shell_executable to /bin/sh 44109 1727204256.24724: Set connection var ansible_shell_type to sh 44109 1727204256.24753: variable 'ansible_shell_executable' from source: unknown 44109 1727204256.24757: variable 'ansible_connection' from source: unknown 44109 1727204256.24759: variable 'ansible_module_compression' from source: unknown 44109 1727204256.24762: variable 'ansible_shell_type' from source: unknown 44109 1727204256.24764: variable 'ansible_shell_executable' from source: unknown 44109 1727204256.24766: variable 'ansible_host' from source: host vars for 'managed-node1' 44109 1727204256.24770: variable 'ansible_pipelining' from source: unknown 44109 1727204256.24772: variable 'ansible_timeout' from source: unknown 44109 1727204256.24774: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44109 1727204256.24912: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 44109 1727204256.24923: variable 'omit' from source: magic vars 44109 1727204256.24926: starting attempt loop 44109 1727204256.24929: running the handler 44109 1727204256.24941: variable 'ansible_facts' from source: unknown 44109 1727204256.24961: _low_level_execute_command(): starting 44109 1727204256.24967: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 44109 1727204256.25509: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44109 1727204256.25615: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 44109 1727204256.25621: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44109 1727204256.25643: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 44109 1727204256.25728: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44109 1727204256.27511: stdout chunk (state=3): >>>/root <<< 44109 1727204256.27606: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44109 1727204256.27635: stderr chunk (state=3): >>><<< 44109 1727204256.27639: stdout chunk (state=3): >>><<< 44109 1727204256.27660: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final 
Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44109 1727204256.27673: _low_level_execute_command(): starting 44109 1727204256.27682: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204256.2766085-46572-241416832393954 `" && echo ansible-tmp-1727204256.2766085-46572-241416832393954="` echo /root/.ansible/tmp/ansible-tmp-1727204256.2766085-46572-241416832393954 `" ) && sleep 0' 44109 1727204256.28117: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 44109 1727204256.28122: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found <<< 44109 1727204256.28125: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 44109 1727204256.28128: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44109 1727204256.28137: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44109 1727204256.28181: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 44109 1727204256.28184: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 44109 1727204256.28271: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44109 1727204256.30354: stdout chunk (state=3): >>>ansible-tmp-1727204256.2766085-46572-241416832393954=/root/.ansible/tmp/ansible-tmp-1727204256.2766085-46572-241416832393954 <<< 44109 1727204256.30457: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44109 1727204256.30489: stderr chunk (state=3): >>><<< 44109 1727204256.30492: stdout chunk (state=3): >>><<< 44109 1727204256.30512: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204256.2766085-46572-241416832393954=/root/.ansible/tmp/ansible-tmp-1727204256.2766085-46572-241416832393954 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44109 1727204256.30539: variable 'ansible_module_compression' from source: unknown 44109 1727204256.30582: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-44109pzfqangk/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 44109 1727204256.30637: variable 'ansible_facts' from source: unknown 44109 1727204256.30769: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204256.2766085-46572-241416832393954/AnsiballZ_setup.py 44109 1727204256.30873: Sending initial data 44109 1727204256.30879: Sent initial data (154 bytes) 44109 1727204256.31317: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 44109 1727204256.31321: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found <<< 44109 1727204256.31323: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44109 1727204256.31326: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44109 1727204256.31334: stderr chunk (state=3): >>>debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44109 1727204256.31372: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 44109 1727204256.31398: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 44109 1727204256.31473: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44109 1727204256.33211: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 44109 1727204256.33282: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 44109 1727204256.33355: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-44109pzfqangk/tmpsjgzthwm /root/.ansible/tmp/ansible-tmp-1727204256.2766085-46572-241416832393954/AnsiballZ_setup.py <<< 44109 1727204256.33362: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204256.2766085-46572-241416832393954/AnsiballZ_setup.py" <<< 44109 1727204256.33438: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-44109pzfqangk/tmpsjgzthwm" to remote "/root/.ansible/tmp/ansible-tmp-1727204256.2766085-46572-241416832393954/AnsiballZ_setup.py" <<< 44109 1727204256.33441: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204256.2766085-46572-241416832393954/AnsiballZ_setup.py" <<< 44109 1727204256.34634: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44109 1727204256.34680: stderr chunk (state=3): >>><<< 44109 1727204256.34683: stdout chunk (state=3): >>><<< 44109 1727204256.34700: done transferring module to remote 44109 1727204256.34710: _low_level_execute_command(): starting 44109 1727204256.34716: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204256.2766085-46572-241416832393954/ /root/.ansible/tmp/ansible-tmp-1727204256.2766085-46572-241416832393954/AnsiballZ_setup.py && sleep 0' 44109 1727204256.35178: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 44109 1727204256.35182: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found <<< 44109 1727204256.35184: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration <<< 44109 1727204256.35186: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found <<< 44109 1727204256.35188: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44109 1727204256.35237: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 44109 1727204256.35245: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 44109 1727204256.35325: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44109 1727204256.37271: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44109 1727204256.37298: stderr chunk (state=3): >>><<< 44109 1727204256.37301: stdout chunk (state=3): >>><<< 44109 1727204256.37318: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44109 1727204256.37321: _low_level_execute_command(): starting 44109 1727204256.37326: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204256.2766085-46572-241416832393954/AnsiballZ_setup.py && sleep 0' 44109 1727204256.37747: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44109 1727204256.37751: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44109 1727204256.37763: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44109 1727204256.37829: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 44109 1727204256.37831: stderr 
chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44109 1727204256.37833: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44109 1727204256.37916: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44109 1727204257.03572: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCWk3MCbpUJBEaXgG200pw3DBo34ukitT+wfYcFTuNXyUUPaXQe7v940X0cf5U78BgS3AAiRxfHirMb+4r43rwxBe5tl4Vq2WM+kz3JnOtxK8ZXTmwS9PbltzX5fg5CVds9Bu6KIwABJMlgT9CTHVjFlTBc8wpoeJvx8wVgMwQlnF+PFN/lzD0gn0sF11bqe2QCvxmm9r7Lum/QdIVGgOiZMMbULJZb0Iih24Tn74Ho6n9zLSFZ5FiFifjm7M6k1hVtfcAQi3GfPdUxkN0Z66f3KaW4hptFlHxttjLliuAyfNF4UrXIab7y/nDix1ll4x4lLHsVRpcwtmVVe/Z+2/pcmRv4fjw8YzWY1pLV5u1BUUBthICgfv0pXEuLP9UD/krnjy0Ho9gAbtjdoRWtn7gvjRds+WEwk83rZS3UnAc3pl2DmHzbp4IfRC1zp8eJPJoVwcSTEr61su59tkNntjdKAcLWeas1p9XBaTIRx7aqRp7Vdet96xbSnDnBCriXgAU=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBJzECuW8BnvEbYnQxnxmebvg9TYk9r0OUd9aUg8FFv4MvjSzW8tCfnW556hw9n4PI2hShtAWz7XExrMZPlTQXRo=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAINUE+mKAHEgVl/vTdVMwRCu3lDCTOYBl1RcikvxylCeg", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_is_chroot": false, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-25.el10.x86_64", "root": "UUID=973ca870-ed1b-4e56-a8b4-735608119a28", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-25.el10.x86_64", "root": "UUID=973ca870-ed1b-4e56-a8b4-735608119a28", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, 
"ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_env": {"SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.45.138 50362 10.31.14.47 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.45.138 50362 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_lsb": {}, "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_system": "Linux", "ansible_kernel": "6.11.0-25.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Mon Sep 16 20:35:26 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "managed-node1", "ansible_hostname": "managed-node1", 
"ansible_nodename": "managed-node1", "ansible_domain": "", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec277914f6c5b9c03bd977e30033112b", "ansible_apparmor": {"status": "disabled"}, "ansible_local": {}, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_fibre_channel_wwn": [], "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2907, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 624, "free": 2907}, "nocache": {"free": 3267, "used": 264}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec277914-f6c5-b9c0-3bd9-77e30033112b", "ansible_product_uuid": "ec277914-f6c5-b9c0-3bd9-77e30033112b", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": 
["973ca870-ed1b-4e56-a8b4-735608119a28"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "973ca870-ed1b-4e56-a8b4-735608119a28", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["973ca870-ed1b-4e56-a8b4-735608119a28"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 847, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261781118976, "block_size": 4096, "block_total": 65519099, "block_available": 63911406, "block_used": 1607693, "inode_total": 131070960, "inode_available": 131027257, "inode_used": 43703, "uuid": "973ca870-ed1b-4e56-a8b4-735608119a28"}], "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_interfaces": ["eth0", "lo"], "ansible_eth0": {"device": "eth0", "macaddress": "0a:ff:dd:89:9b:e5", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.14.47", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22"}, "ipv6": [{"address": "fe80::8ff:ddff:fe89:9be5", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", 
"tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", 
"hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off 
[fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.12.1", "interface": "eth0", "address": "10.31.14.47", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22", "macaddress": "0a:ff:dd:89:9b:e5", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.14.47"], "ansible_all_ipv6_addresses": ["fe80::8ff:ddff:fe89:9be5"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.14.47", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::8ff:ddff:fe89:9be5"]}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Tuesday", "weekday_number": "2", "weeknumber": "39", "day": "24", "hour": "14", "minute": "57", "second": "37", "epoch": "1727204257", "epoch_int": "1727204257", "date": "2024-09-24", "time": "14:57:37", "iso8601_micro": "2024-09-24T18:57:37.030634Z", "iso8601": "2024-09-24T18:57:37Z", "iso8601_basic": "20240924T145737030634", "iso8601_basic_short": "20240924T145737", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_service_mgr": "systemd", "ansible_loadavg": {"1m": 0.6171875, "5m": 0.5419921875, "15m": 0.31689453125}, "ansible_iscsi_iqn": "", "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:11e86335-d786-4518-8abc-c9417b351256", 
"ansible_fips": false, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_pkg_mgr": "dnf", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 44109 1727204257.05804: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.47 closed. <<< 44109 1727204257.05834: stderr chunk (state=3): >>><<< 44109 1727204257.05837: stdout chunk (state=3): >>><<< 44109 1727204257.05862: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCWk3MCbpUJBEaXgG200pw3DBo34ukitT+wfYcFTuNXyUUPaXQe7v940X0cf5U78BgS3AAiRxfHirMb+4r43rwxBe5tl4Vq2WM+kz3JnOtxK8ZXTmwS9PbltzX5fg5CVds9Bu6KIwABJMlgT9CTHVjFlTBc8wpoeJvx8wVgMwQlnF+PFN/lzD0gn0sF11bqe2QCvxmm9r7Lum/QdIVGgOiZMMbULJZb0Iih24Tn74Ho6n9zLSFZ5FiFifjm7M6k1hVtfcAQi3GfPdUxkN0Z66f3KaW4hptFlHxttjLliuAyfNF4UrXIab7y/nDix1ll4x4lLHsVRpcwtmVVe/Z+2/pcmRv4fjw8YzWY1pLV5u1BUUBthICgfv0pXEuLP9UD/krnjy0Ho9gAbtjdoRWtn7gvjRds+WEwk83rZS3UnAc3pl2DmHzbp4IfRC1zp8eJPJoVwcSTEr61su59tkNntjdKAcLWeas1p9XBaTIRx7aqRp7Vdet96xbSnDnBCriXgAU=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBJzECuW8BnvEbYnQxnxmebvg9TYk9r0OUd9aUg8FFv4MvjSzW8tCfnW556hw9n4PI2hShtAWz7XExrMZPlTQXRo=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAINUE+mKAHEgVl/vTdVMwRCu3lDCTOYBl1RcikvxylCeg", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_is_chroot": false, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-25.el10.x86_64", "root": 
"UUID=973ca870-ed1b-4e56-a8b4-735608119a28", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-25.el10.x86_64", "root": "UUID=973ca870-ed1b-4e56-a8b4-735608119a28", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_env": {"SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.45.138 50362 10.31.14.47 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.45.138 50362 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_lsb": {}, "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", 
"ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_system": "Linux", "ansible_kernel": "6.11.0-25.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Mon Sep 16 20:35:26 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "managed-node1", "ansible_hostname": "managed-node1", "ansible_nodename": "managed-node1", "ansible_domain": "", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec277914f6c5b9c03bd977e30033112b", "ansible_apparmor": {"status": "disabled"}, "ansible_local": {}, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_fibre_channel_wwn": [], "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2907, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 624, "free": 2907}, "nocache": {"free": 3267, "used": 264}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": 
"ec277914-f6c5-b9c0-3bd9-77e30033112b", "ansible_product_uuid": "ec277914-f6c5-b9c0-3bd9-77e30033112b", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["973ca870-ed1b-4e56-a8b4-735608119a28"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "973ca870-ed1b-4e56-a8b4-735608119a28", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["973ca870-ed1b-4e56-a8b4-735608119a28"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 847, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261781118976, "block_size": 4096, "block_total": 65519099, "block_available": 63911406, "block_used": 1607693, "inode_total": 131070960, "inode_available": 131027257, "inode_used": 43703, "uuid": "973ca870-ed1b-4e56-a8b4-735608119a28"}], "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_interfaces": ["eth0", "lo"], "ansible_eth0": {"device": "eth0", "macaddress": "0a:ff:dd:89:9b:e5", "mtu": 9001, "active": true, "module": 
"xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.14.47", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22"}, "ipv6": [{"address": "fe80::8ff:ddff:fe89:9be5", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", 
"l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", 
"tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.12.1", "interface": "eth0", "address": "10.31.14.47", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22", "macaddress": "0a:ff:dd:89:9b:e5", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.14.47"], "ansible_all_ipv6_addresses": ["fe80::8ff:ddff:fe89:9be5"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.14.47", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::8ff:ddff:fe89:9be5"]}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Tuesday", "weekday_number": "2", "weeknumber": "39", "day": "24", "hour": "14", "minute": "57", "second": "37", "epoch": "1727204257", "epoch_int": "1727204257", "date": "2024-09-24", "time": "14:57:37", 
"iso8601_micro": "2024-09-24T18:57:37.030634Z", "iso8601": "2024-09-24T18:57:37Z", "iso8601_basic": "20240924T145737030634", "iso8601_basic_short": "20240924T145737", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_service_mgr": "systemd", "ansible_loadavg": {"1m": 0.6171875, "5m": 0.5419921875, "15m": 0.31689453125}, "ansible_iscsi_iqn": "", "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:11e86335-d786-4518-8abc-c9417b351256", "ansible_fips": false, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_pkg_mgr": "dnf", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: 
mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.47 closed. 44109 1727204257.06092: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204256.2766085-46572-241416832393954/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 44109 1727204257.06111: _low_level_execute_command(): starting 44109 1727204257.06118: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204256.2766085-46572-241416832393954/ > /dev/null 2>&1 && sleep 0' 44109 1727204257.06562: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44109 1727204257.06565: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44109 1727204257.06570: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 44109 1727204257.06572: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found <<< 44109 1727204257.06574: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44109 1727204257.06629: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 44109 1727204257.06635: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44109 1727204257.06637: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44109 1727204257.06715: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44109 1727204257.08686: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44109 1727204257.08716: stderr chunk (state=3): >>><<< 44109 1727204257.08719: stdout chunk (state=3): >>><<< 44109 1727204257.08733: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing 
master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44109 1727204257.08740: handler run complete 44109 1727204257.08823: variable 'ansible_facts' from source: unknown 44109 1727204257.08900: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44109 1727204257.09079: variable 'ansible_facts' from source: unknown 44109 1727204257.09134: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44109 1727204257.09205: attempt loop complete, returning result 44109 1727204257.09210: _execute() done 44109 1727204257.09215: dumping result to json 44109 1727204257.09233: done dumping result, returning 44109 1727204257.09243: done running TaskExecutor() for managed-node1/TASK: Gathering Facts [028d2410-947f-ed67-a560-00000000059d] 44109 1727204257.09246: sending task result for task 028d2410-947f-ed67-a560-00000000059d 44109 1727204257.09531: done sending task result for task 028d2410-947f-ed67-a560-00000000059d 44109 1727204257.09534: WORKER PROCESS EXITING ok: [managed-node1] 44109 1727204257.09782: no more pending results, returning what we have 44109 1727204257.09786: results queue empty 44109 1727204257.09787: checking for any_errors_fatal 44109 1727204257.09789: done checking for any_errors_fatal 44109 1727204257.09789: checking for max_fail_percentage 44109 1727204257.09791: done checking for max_fail_percentage 44109 1727204257.09792: checking to see if all hosts have failed and the running result is not ok 44109 1727204257.09793: done checking to see if all hosts have failed 44109 1727204257.09793: getting the remaining hosts for this loop 44109 1727204257.09794: done getting the remaining hosts for this loop 44109 1727204257.09798: getting the next task for host managed-node1 44109 
1727204257.09807: done getting next task for host managed-node1 44109 1727204257.09809: ^ task is: TASK: meta (flush_handlers) 44109 1727204257.09815: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44109 1727204257.09819: getting variables 44109 1727204257.09821: in VariableManager get_vars() 44109 1727204257.09848: Calling all_inventory to load vars for managed-node1 44109 1727204257.09851: Calling groups_inventory to load vars for managed-node1 44109 1727204257.09853: Calling all_plugins_inventory to load vars for managed-node1 44109 1727204257.09865: Calling all_plugins_play to load vars for managed-node1 44109 1727204257.09868: Calling groups_plugins_inventory to load vars for managed-node1 44109 1727204257.09871: Calling groups_plugins_play to load vars for managed-node1 44109 1727204257.11261: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44109 1727204257.12267: done with get_vars() 44109 1727204257.12286: done getting variables 44109 1727204257.12338: in VariableManager get_vars() 44109 1727204257.12348: Calling all_inventory to load vars for managed-node1 44109 1727204257.12349: Calling groups_inventory to load vars for managed-node1 44109 1727204257.12351: Calling all_plugins_inventory to load vars for managed-node1 44109 1727204257.12354: Calling all_plugins_play to load vars for managed-node1 44109 1727204257.12355: Calling groups_plugins_inventory to load vars for managed-node1 44109 1727204257.12357: Calling groups_plugins_play to load vars for managed-node1 44109 1727204257.13173: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44109 1727204257.14694: done with get_vars() 44109 
1727204257.14728: done queuing things up, now waiting for results queue to drain 44109 1727204257.14730: results queue empty 44109 1727204257.14730: checking for any_errors_fatal 44109 1727204257.14733: done checking for any_errors_fatal 44109 1727204257.14734: checking for max_fail_percentage 44109 1727204257.14738: done checking for max_fail_percentage 44109 1727204257.14739: checking to see if all hosts have failed and the running result is not ok 44109 1727204257.14739: done checking to see if all hosts have failed 44109 1727204257.14740: getting the remaining hosts for this loop 44109 1727204257.14740: done getting the remaining hosts for this loop 44109 1727204257.14742: getting the next task for host managed-node1 44109 1727204257.14745: done getting next task for host managed-node1 44109 1727204257.14748: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 44109 1727204257.14749: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44109 1727204257.14756: getting variables 44109 1727204257.14756: in VariableManager get_vars() 44109 1727204257.14767: Calling all_inventory to load vars for managed-node1 44109 1727204257.14768: Calling groups_inventory to load vars for managed-node1 44109 1727204257.14770: Calling all_plugins_inventory to load vars for managed-node1 44109 1727204257.14773: Calling all_plugins_play to load vars for managed-node1 44109 1727204257.14775: Calling groups_plugins_inventory to load vars for managed-node1 44109 1727204257.14780: Calling groups_plugins_play to load vars for managed-node1 44109 1727204257.15486: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44109 1727204257.16352: done with get_vars() 44109 1727204257.16368: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4 Tuesday 24 September 2024 14:57:37 -0400 (0:00:00.934) 0:00:33.960 ***** 44109 1727204257.16455: entering _queue_task() for managed-node1/include_tasks 44109 1727204257.16833: worker is 1 (out of 1 available) 44109 1727204257.16844: exiting _queue_task() for managed-node1/include_tasks 44109 1727204257.16856: done queuing things up, now waiting for results queue to drain 44109 1727204257.16857: waiting for pending results... 
44109 1727204257.17297: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 44109 1727204257.17302: in run() - task 028d2410-947f-ed67-a560-000000000091 44109 1727204257.17306: variable 'ansible_search_path' from source: unknown 44109 1727204257.17309: variable 'ansible_search_path' from source: unknown 44109 1727204257.17394: calling self._execute() 44109 1727204257.17450: variable 'ansible_host' from source: host vars for 'managed-node1' 44109 1727204257.17461: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44109 1727204257.17508: variable 'omit' from source: magic vars 44109 1727204257.17797: variable 'ansible_distribution_major_version' from source: facts 44109 1727204257.17806: Evaluated conditional (ansible_distribution_major_version != '6'): True 44109 1727204257.17815: _execute() done 44109 1727204257.17818: dumping result to json 44109 1727204257.17821: done dumping result, returning 44109 1727204257.17824: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [028d2410-947f-ed67-a560-000000000091] 44109 1727204257.17828: sending task result for task 028d2410-947f-ed67-a560-000000000091 44109 1727204257.17923: done sending task result for task 028d2410-947f-ed67-a560-000000000091 44109 1727204257.17926: WORKER PROCESS EXITING 44109 1727204257.17974: no more pending results, returning what we have 44109 1727204257.17981: in VariableManager get_vars() 44109 1727204257.18031: Calling all_inventory to load vars for managed-node1 44109 1727204257.18034: Calling groups_inventory to load vars for managed-node1 44109 1727204257.18036: Calling all_plugins_inventory to load vars for managed-node1 44109 1727204257.18047: Calling all_plugins_play to load vars for managed-node1 44109 1727204257.18049: Calling groups_plugins_inventory to load vars for managed-node1 44109 1727204257.18052: Calling 
groups_plugins_play to load vars for managed-node1 44109 1727204257.18873: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44109 1727204257.19766: done with get_vars() 44109 1727204257.19784: variable 'ansible_search_path' from source: unknown 44109 1727204257.19785: variable 'ansible_search_path' from source: unknown 44109 1727204257.19807: we have included files to process 44109 1727204257.19808: generating all_blocks data 44109 1727204257.19808: done generating all_blocks data 44109 1727204257.19809: processing included file: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 44109 1727204257.19810: loading included file: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 44109 1727204257.19811: Loading data from /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 44109 1727204257.20199: done processing included file 44109 1727204257.20200: iterating over new_blocks loaded from include file 44109 1727204257.20201: in VariableManager get_vars() 44109 1727204257.20217: done with get_vars() 44109 1727204257.20218: filtering new block on tags 44109 1727204257.20228: done filtering new block on tags 44109 1727204257.20230: in VariableManager get_vars() 44109 1727204257.20242: done with get_vars() 44109 1727204257.20243: filtering new block on tags 44109 1727204257.20253: done filtering new block on tags 44109 1727204257.20255: in VariableManager get_vars() 44109 1727204257.20269: done with get_vars() 44109 1727204257.20270: filtering new block on tags 44109 1727204257.20281: done filtering new block on tags 44109 1727204257.20282: done iterating over new_blocks loaded from include file included: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml for managed-node1 44109 1727204257.20286: extending task lists for 
all hosts with included blocks 44109 1727204257.20501: done extending task lists 44109 1727204257.20502: done processing included files 44109 1727204257.20503: results queue empty 44109 1727204257.20503: checking for any_errors_fatal 44109 1727204257.20504: done checking for any_errors_fatal 44109 1727204257.20505: checking for max_fail_percentage 44109 1727204257.20505: done checking for max_fail_percentage 44109 1727204257.20506: checking to see if all hosts have failed and the running result is not ok 44109 1727204257.20506: done checking to see if all hosts have failed 44109 1727204257.20507: getting the remaining hosts for this loop 44109 1727204257.20508: done getting the remaining hosts for this loop 44109 1727204257.20509: getting the next task for host managed-node1 44109 1727204257.20514: done getting next task for host managed-node1 44109 1727204257.20515: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 44109 1727204257.20517: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44109 1727204257.20523: getting variables 44109 1727204257.20524: in VariableManager get_vars() 44109 1727204257.20534: Calling all_inventory to load vars for managed-node1 44109 1727204257.20535: Calling groups_inventory to load vars for managed-node1 44109 1727204257.20536: Calling all_plugins_inventory to load vars for managed-node1 44109 1727204257.20540: Calling all_plugins_play to load vars for managed-node1 44109 1727204257.20541: Calling groups_plugins_inventory to load vars for managed-node1 44109 1727204257.20543: Calling groups_plugins_play to load vars for managed-node1 44109 1727204257.21269: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44109 1727204257.22148: done with get_vars() 44109 1727204257.22165: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role are present] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:3 Tuesday 24 September 2024 14:57:37 -0400 (0:00:00.057) 0:00:34.018 ***** 44109 1727204257.22224: entering _queue_task() for managed-node1/setup 44109 1727204257.22487: worker is 1 (out of 1 available) 44109 1727204257.22499: exiting _queue_task() for managed-node1/setup 44109 1727204257.22510: done queuing things up, now waiting for results queue to drain 44109 1727204257.22511: waiting for pending results... 
44109 1727204257.22697: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 44109 1727204257.22783: in run() - task 028d2410-947f-ed67-a560-0000000005de 44109 1727204257.22794: variable 'ansible_search_path' from source: unknown 44109 1727204257.22797: variable 'ansible_search_path' from source: unknown 44109 1727204257.22827: calling self._execute() 44109 1727204257.22901: variable 'ansible_host' from source: host vars for 'managed-node1' 44109 1727204257.22905: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44109 1727204257.22915: variable 'omit' from source: magic vars 44109 1727204257.23181: variable 'ansible_distribution_major_version' from source: facts 44109 1727204257.23193: Evaluated conditional (ansible_distribution_major_version != '6'): True 44109 1727204257.23338: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 44109 1727204257.24826: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 44109 1727204257.24872: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 44109 1727204257.24902: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 44109 1727204257.24929: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 44109 1727204257.24950: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 44109 1727204257.25009: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44109 1727204257.25032: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' 
from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44109 1727204257.25055: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44109 1727204257.25082: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44109 1727204257.25094: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44109 1727204257.25135: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44109 1727204257.25154: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44109 1727204257.25171: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44109 1727204257.25206: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44109 1727204257.25219: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44109 1727204257.25333: variable '__network_required_facts' from source: role 
'' defaults 44109 1727204257.25342: variable 'ansible_facts' from source: unknown 44109 1727204257.25803: Evaluated conditional (__network_required_facts | difference(ansible_facts.keys() | list) | length > 0): False 44109 1727204257.25806: when evaluation is False, skipping this task 44109 1727204257.25809: _execute() done 44109 1727204257.25811: dumping result to json 44109 1727204257.25813: done dumping result, returning 44109 1727204257.25823: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present [028d2410-947f-ed67-a560-0000000005de] 44109 1727204257.25826: sending task result for task 028d2410-947f-ed67-a560-0000000005de 44109 1727204257.25910: done sending task result for task 028d2410-947f-ed67-a560-0000000005de 44109 1727204257.25913: WORKER PROCESS EXITING skipping: [managed-node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 44109 1727204257.25957: no more pending results, returning what we have 44109 1727204257.25961: results queue empty 44109 1727204257.25962: checking for any_errors_fatal 44109 1727204257.25963: done checking for any_errors_fatal 44109 1727204257.25964: checking for max_fail_percentage 44109 1727204257.25965: done checking for max_fail_percentage 44109 1727204257.25966: checking to see if all hosts have failed and the running result is not ok 44109 1727204257.25967: done checking to see if all hosts have failed 44109 1727204257.25968: getting the remaining hosts for this loop 44109 1727204257.25969: done getting the remaining hosts for this loop 44109 1727204257.25972: getting the next task for host managed-node1 44109 1727204257.25983: done getting next task for host managed-node1 44109 1727204257.25987: ^ task is: TASK: fedora.linux_system_roles.network : Check if system is ostree 44109 1727204257.25990: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, 
handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44109 1727204257.26004: getting variables 44109 1727204257.26005: in VariableManager get_vars() 44109 1727204257.26042: Calling all_inventory to load vars for managed-node1 44109 1727204257.26045: Calling groups_inventory to load vars for managed-node1 44109 1727204257.26047: Calling all_plugins_inventory to load vars for managed-node1 44109 1727204257.26057: Calling all_plugins_play to load vars for managed-node1 44109 1727204257.26059: Calling groups_plugins_inventory to load vars for managed-node1 44109 1727204257.26062: Calling groups_plugins_play to load vars for managed-node1 44109 1727204257.26895: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44109 1727204257.27889: done with get_vars() 44109 1727204257.27908: done getting variables TASK [fedora.linux_system_roles.network : Check if system is ostree] *********** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:12 Tuesday 24 September 2024 14:57:37 -0400 (0:00:00.057) 0:00:34.076 ***** 44109 1727204257.27982: entering _queue_task() for managed-node1/stat 44109 1727204257.28247: worker is 1 (out of 1 available) 44109 1727204257.28259: exiting _queue_task() for managed-node1/stat 44109 1727204257.28271: done queuing things up, now waiting for results queue to drain 44109 1727204257.28272: waiting for pending results... 
44109 1727204257.28459: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Check if system is ostree 44109 1727204257.28549: in run() - task 028d2410-947f-ed67-a560-0000000005e0 44109 1727204257.28561: variable 'ansible_search_path' from source: unknown 44109 1727204257.28564: variable 'ansible_search_path' from source: unknown 44109 1727204257.28593: calling self._execute() 44109 1727204257.28671: variable 'ansible_host' from source: host vars for 'managed-node1' 44109 1727204257.28675: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44109 1727204257.28684: variable 'omit' from source: magic vars 44109 1727204257.28963: variable 'ansible_distribution_major_version' from source: facts 44109 1727204257.28971: Evaluated conditional (ansible_distribution_major_version != '6'): True 44109 1727204257.29093: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 44109 1727204257.29295: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 44109 1727204257.29329: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 44109 1727204257.29354: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 44109 1727204257.29384: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 44109 1727204257.29451: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 44109 1727204257.29468: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 44109 1727204257.29491: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 44109 1727204257.29509: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 44109 1727204257.29573: variable '__network_is_ostree' from source: set_fact 44109 1727204257.29580: Evaluated conditional (not __network_is_ostree is defined): False 44109 1727204257.29585: when evaluation is False, skipping this task 44109 1727204257.29588: _execute() done 44109 1727204257.29590: dumping result to json 44109 1727204257.29593: done dumping result, returning 44109 1727204257.29603: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Check if system is ostree [028d2410-947f-ed67-a560-0000000005e0] 44109 1727204257.29606: sending task result for task 028d2410-947f-ed67-a560-0000000005e0 44109 1727204257.29689: done sending task result for task 028d2410-947f-ed67-a560-0000000005e0 44109 1727204257.29691: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 44109 1727204257.29748: no more pending results, returning what we have 44109 1727204257.29752: results queue empty 44109 1727204257.29752: checking for any_errors_fatal 44109 1727204257.29760: done checking for any_errors_fatal 44109 1727204257.29760: checking for max_fail_percentage 44109 1727204257.29762: done checking for max_fail_percentage 44109 1727204257.29763: checking to see if all hosts have failed and the running result is not ok 44109 1727204257.29764: done checking to see if all hosts have failed 44109 1727204257.29764: getting the remaining hosts for this loop 44109 1727204257.29765: done getting the remaining hosts for this loop 44109 
1727204257.29769: getting the next task for host managed-node1 44109 1727204257.29775: done getting next task for host managed-node1 44109 1727204257.29780: ^ task is: TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 44109 1727204257.29782: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44109 1727204257.29796: getting variables 44109 1727204257.29798: in VariableManager get_vars() 44109 1727204257.29833: Calling all_inventory to load vars for managed-node1 44109 1727204257.29836: Calling groups_inventory to load vars for managed-node1 44109 1727204257.29838: Calling all_plugins_inventory to load vars for managed-node1 44109 1727204257.29847: Calling all_plugins_play to load vars for managed-node1 44109 1727204257.29850: Calling groups_plugins_inventory to load vars for managed-node1 44109 1727204257.29852: Calling groups_plugins_play to load vars for managed-node1 44109 1727204257.30661: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44109 1727204257.31553: done with get_vars() 44109 1727204257.31572: done getting variables 44109 1727204257.31618: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK 
[fedora.linux_system_roles.network : Set flag to indicate system is ostree] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:17 Tuesday 24 September 2024 14:57:37 -0400 (0:00:00.036) 0:00:34.112 ***** 44109 1727204257.31645: entering _queue_task() for managed-node1/set_fact 44109 1727204257.31907: worker is 1 (out of 1 available) 44109 1727204257.31919: exiting _queue_task() for managed-node1/set_fact 44109 1727204257.31931: done queuing things up, now waiting for results queue to drain 44109 1727204257.31932: waiting for pending results... 44109 1727204257.32114: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 44109 1727204257.32207: in run() - task 028d2410-947f-ed67-a560-0000000005e1 44109 1727204257.32222: variable 'ansible_search_path' from source: unknown 44109 1727204257.32226: variable 'ansible_search_path' from source: unknown 44109 1727204257.32253: calling self._execute() 44109 1727204257.32332: variable 'ansible_host' from source: host vars for 'managed-node1' 44109 1727204257.32335: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44109 1727204257.32344: variable 'omit' from source: magic vars 44109 1727204257.32620: variable 'ansible_distribution_major_version' from source: facts 44109 1727204257.32630: Evaluated conditional (ansible_distribution_major_version != '6'): True 44109 1727204257.32765: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 44109 1727204257.32961: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 44109 1727204257.32997: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 44109 1727204257.33025: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 44109 
1727204257.33052: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 44109 1727204257.33123: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 44109 1727204257.33141: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 44109 1727204257.33164: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 44109 1727204257.33185: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 44109 1727204257.33248: variable '__network_is_ostree' from source: set_fact 44109 1727204257.33254: Evaluated conditional (not __network_is_ostree is defined): False 44109 1727204257.33257: when evaluation is False, skipping this task 44109 1727204257.33268: _execute() done 44109 1727204257.33272: dumping result to json 44109 1727204257.33274: done dumping result, returning 44109 1727204257.33280: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree [028d2410-947f-ed67-a560-0000000005e1] 44109 1727204257.33282: sending task result for task 028d2410-947f-ed67-a560-0000000005e1 44109 1727204257.33360: done sending task result for task 028d2410-947f-ed67-a560-0000000005e1 44109 1727204257.33363: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 44109 1727204257.33415: no more pending results, returning what we 
have 44109 1727204257.33419: results queue empty 44109 1727204257.33420: checking for any_errors_fatal 44109 1727204257.33427: done checking for any_errors_fatal 44109 1727204257.33428: checking for max_fail_percentage 44109 1727204257.33430: done checking for max_fail_percentage 44109 1727204257.33431: checking to see if all hosts have failed and the running result is not ok 44109 1727204257.33432: done checking to see if all hosts have failed 44109 1727204257.33433: getting the remaining hosts for this loop 44109 1727204257.33434: done getting the remaining hosts for this loop 44109 1727204257.33437: getting the next task for host managed-node1 44109 1727204257.33445: done getting next task for host managed-node1 44109 1727204257.33449: ^ task is: TASK: fedora.linux_system_roles.network : Check which services are running 44109 1727204257.33451: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44109 1727204257.33465: getting variables 44109 1727204257.33467: in VariableManager get_vars() 44109 1727204257.33506: Calling all_inventory to load vars for managed-node1 44109 1727204257.33509: Calling groups_inventory to load vars for managed-node1 44109 1727204257.33511: Calling all_plugins_inventory to load vars for managed-node1 44109 1727204257.33522: Calling all_plugins_play to load vars for managed-node1 44109 1727204257.33525: Calling groups_plugins_inventory to load vars for managed-node1 44109 1727204257.33528: Calling groups_plugins_play to load vars for managed-node1 44109 1727204257.34805: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44109 1727204257.35872: done with get_vars() 44109 1727204257.35896: done getting variables TASK [fedora.linux_system_roles.network : Check which services are running] **** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 Tuesday 24 September 2024 14:57:37 -0400 (0:00:00.043) 0:00:34.155 ***** 44109 1727204257.35967: entering _queue_task() for managed-node1/service_facts 44109 1727204257.36229: worker is 1 (out of 1 available) 44109 1727204257.36242: exiting _queue_task() for managed-node1/service_facts 44109 1727204257.36254: done queuing things up, now waiting for results queue to drain 44109 1727204257.36255: waiting for pending results... 
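The run now enters the "Check which services are running" task, which queues the `service_facts` module. A hedged sketch of this pattern (the first task mirrors the log's task name; the consumer task is purely illustrative and not part of the role — it keys off `chronyd.service`, which the module output further below reports as `running`):

```yaml
# Sketch: collect the per-service state table into ansible_facts.services,
# then branch on an individual entry. Structure matches the module output
# seen later in this log: services["<name>.service"].state / .status / .source.
- name: Check which services are running
  ansible.builtin.service_facts:

- name: Example consumer (illustrative only, not from the role)
  ansible.builtin.debug:
    msg: "chronyd is running"
  when: ansible_facts.services['chronyd.service'].state | default('') == 'running'
```

The `| default('')` guard keeps the conditional safe on hosts where the service entry is absent, which would otherwise raise an undefined-key error.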
44109 1727204257.36435: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Check which services are running 44109 1727204257.36518: in run() - task 028d2410-947f-ed67-a560-0000000005e3 44109 1727204257.36529: variable 'ansible_search_path' from source: unknown 44109 1727204257.36533: variable 'ansible_search_path' from source: unknown 44109 1727204257.36561: calling self._execute() 44109 1727204257.36642: variable 'ansible_host' from source: host vars for 'managed-node1' 44109 1727204257.36646: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44109 1727204257.36654: variable 'omit' from source: magic vars 44109 1727204257.36933: variable 'ansible_distribution_major_version' from source: facts 44109 1727204257.36942: Evaluated conditional (ansible_distribution_major_version != '6'): True 44109 1727204257.36949: variable 'omit' from source: magic vars 44109 1727204257.37029: variable 'omit' from source: magic vars 44109 1727204257.37058: variable 'omit' from source: magic vars 44109 1727204257.37128: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 44109 1727204257.37194: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 44109 1727204257.37388: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 44109 1727204257.37391: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44109 1727204257.37393: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44109 1727204257.37395: variable 'inventory_hostname' from source: host vars for 'managed-node1' 44109 1727204257.37398: variable 'ansible_host' from source: host vars for 'managed-node1' 44109 1727204257.37400: variable 'ansible_ssh_extra_args' from source: host 
vars for 'managed-node1' 44109 1727204257.37403: Set connection var ansible_connection to ssh 44109 1727204257.37405: Set connection var ansible_timeout to 10 44109 1727204257.37407: Set connection var ansible_module_compression to ZIP_DEFLATED 44109 1727204257.37408: Set connection var ansible_pipelining to False 44109 1727204257.37488: Set connection var ansible_shell_executable to /bin/sh 44109 1727204257.37498: Set connection var ansible_shell_type to sh 44109 1727204257.37524: variable 'ansible_shell_executable' from source: unknown 44109 1727204257.37532: variable 'ansible_connection' from source: unknown 44109 1727204257.37540: variable 'ansible_module_compression' from source: unknown 44109 1727204257.37546: variable 'ansible_shell_type' from source: unknown 44109 1727204257.37552: variable 'ansible_shell_executable' from source: unknown 44109 1727204257.37558: variable 'ansible_host' from source: host vars for 'managed-node1' 44109 1727204257.37566: variable 'ansible_pipelining' from source: unknown 44109 1727204257.37573: variable 'ansible_timeout' from source: unknown 44109 1727204257.37584: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44109 1727204257.37771: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 44109 1727204257.37792: variable 'omit' from source: magic vars 44109 1727204257.37801: starting attempt loop 44109 1727204257.37807: running the handler 44109 1727204257.37826: _low_level_execute_command(): starting 44109 1727204257.37837: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 44109 1727204257.38602: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44109 1727204257.38635: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 44109 1727204257.38660: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44109 1727204257.38683: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44109 1727204257.38795: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44109 1727204257.40584: stdout chunk (state=3): >>>/root <<< 44109 1727204257.40740: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44109 1727204257.40755: stdout chunk (state=3): >>><<< 44109 1727204257.40763: stderr chunk (state=3): >>><<< 44109 1727204257.40804: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final 
Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44109 1727204257.40905: _low_level_execute_command(): starting 44109 1727204257.40909: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204257.408075-46613-32596252086080 `" && echo ansible-tmp-1727204257.408075-46613-32596252086080="` echo /root/.ansible/tmp/ansible-tmp-1727204257.408075-46613-32596252086080 `" ) && sleep 0' 44109 1727204257.41486: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44109 1727204257.41502: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44109 1727204257.41519: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44109 1727204257.41554: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44109 1727204257.41574: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 <<< 44109 1727204257.41660: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is 
address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44109 1727204257.41756: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 44109 1727204257.41760: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44109 1727204257.41804: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44109 1727204257.41907: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44109 1727204257.44009: stdout chunk (state=3): >>>ansible-tmp-1727204257.408075-46613-32596252086080=/root/.ansible/tmp/ansible-tmp-1727204257.408075-46613-32596252086080 <<< 44109 1727204257.44159: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44109 1727204257.44173: stderr chunk (state=3): >>><<< 44109 1727204257.44189: stdout chunk (state=3): >>><<< 44109 1727204257.44383: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204257.408075-46613-32596252086080=/root/.ansible/tmp/ansible-tmp-1727204257.408075-46613-32596252086080 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44109 1727204257.44386: variable 'ansible_module_compression' from source: unknown 44109 1727204257.44389: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-44109pzfqangk/ansiballz_cache/ansible.modules.service_facts-ZIP_DEFLATED 44109 1727204257.44391: variable 'ansible_facts' from source: unknown 44109 1727204257.44468: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204257.408075-46613-32596252086080/AnsiballZ_service_facts.py 44109 1727204257.44722: Sending initial data 44109 1727204257.44730: Sent initial data (160 bytes) 44109 1727204257.45394: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44109 1727204257.45410: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44109 1727204257.45431: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44109 1727204257.45499: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44109 1727204257.45566: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 44109 1727204257.45594: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44109 1727204257.45633: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44109 1727204257.45743: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44109 1727204257.47481: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 44109 1727204257.47573: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 44109 1727204257.47681: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-44109pzfqangk/tmpod6_ocfu /root/.ansible/tmp/ansible-tmp-1727204257.408075-46613-32596252086080/AnsiballZ_service_facts.py <<< 44109 1727204257.47693: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204257.408075-46613-32596252086080/AnsiballZ_service_facts.py" <<< 44109 1727204257.47756: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-44109pzfqangk/tmpod6_ocfu" to remote "/root/.ansible/tmp/ansible-tmp-1727204257.408075-46613-32596252086080/AnsiballZ_service_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204257.408075-46613-32596252086080/AnsiballZ_service_facts.py" <<< 44109 1727204257.48782: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44109 1727204257.48786: stderr chunk (state=3): >>><<< 44109 1727204257.48788: stdout chunk (state=3): >>><<< 44109 1727204257.48790: done transferring module to remote 44109 1727204257.48793: _low_level_execute_command(): starting 44109 1727204257.48795: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204257.408075-46613-32596252086080/ /root/.ansible/tmp/ansible-tmp-1727204257.408075-46613-32596252086080/AnsiballZ_service_facts.py && sleep 0' 44109 1727204257.49431: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44109 1727204257.49464: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44109 1727204257.49573: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK <<< 44109 1727204257.49596: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44109 1727204257.49706: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44109 1727204257.51723: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44109 1727204257.51763: stdout chunk (state=3): >>><<< 44109 1727204257.51766: stderr chunk (state=3): >>><<< 44109 1727204257.51784: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found 
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44109 1727204257.51870: _low_level_execute_command(): starting 44109 1727204257.51874: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204257.408075-46613-32596252086080/AnsiballZ_service_facts.py && sleep 0' 44109 1727204257.52470: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44109 1727204257.52488: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44109 1727204257.52549: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44109 1727204257.52613: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 44109 1727204257.52631: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44109 1727204257.52662: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 
4 <<< 44109 1727204257.52790: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44109 1727204259.28114: stdout chunk (state=3): >>> {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": 
{"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": 
"initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "st<<< 44109 1727204259.28184: stdout chunk (state=3): >>>opped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, 
"NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": 
"stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-clear.service": {"name": "systemd-hibernate-clear.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": 
{"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-quotacheck-root.service": {"name": "systemd-quotacheck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-load-credentials.service": {"name": "systemd-udev-load-credentials.service", "state": "stopped", 
"status": "disabled", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "capsule@.service": {"name": "capsule@.service", "state": "unknown", "status": "static", "source": "systemd"}, "chrony-wait.service": {"name": 
"chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": 
{"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fips-crypto-policy-overlay.service": {"name": "fips-crypto-policy-overlay.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "lvm-devices-import.service": {"name": "lvm-devices-import.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": 
{"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon-root.service": {"name": "quotaon-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon@.service": {"name": "quotaon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": 
"inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "st<<< 44109 1727204259.28212: stdout chunk (state=3): >>>atic", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-bootctl@.service": {"name": "systemd-bootctl@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-creds@.service": {"name": "systemd-creds@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", 
"source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald-sync@.service": {"name": "systemd-journald-sync@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": 
"inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock@.service": {"name": "systemd-pcrlock@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck@.service": {"name": "systemd-quotacheck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": 
"systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} <<<
44109 1727204259.29903: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
44109 1727204259.29984: stderr chunk (state=3): >>>Shared connection to 10.31.14.47 closed. <<<
44109 1727204259.29987: stdout chunk (state=3): >>><<<
44109 1727204259.29990: stderr chunk (state=3): >>><<<
44109 1727204259.30190: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"},
"display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": 
"hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": 
"modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": 
"systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": 
"sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-clear.service": {"name": "systemd-hibernate-clear.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "stopped", "status": "static", 
"source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": 
"systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-quotacheck-root.service": {"name": "systemd-quotacheck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": 
{"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-load-credentials.service": {"name": "systemd-udev-load-credentials.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "ypbind.service": 
{"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "capsule@.service": {"name": "capsule@.service", "state": "unknown", "status": "static", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": 
"debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fips-crypto-policy-overlay.service": {"name": "fips-crypto-policy-overlay.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "lvm-devices-import.service": {"name": "lvm-devices-import.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, 
"man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon-root.service": {"name": "quotaon-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon@.service": {"name": "quotaon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, 
"serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-bootctl@.service": {"name": "systemd-bootctl@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, 
"systemd-creds@.service": {"name": "systemd-creds@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald-sync@.service": {"name": "systemd-journald-sync@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", 
"state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock@.service": {"name": "systemd-pcrlock@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck@.service": {"name": "systemd-quotacheck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": 
"systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.47 closed. 
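The `service_facts` payload above is a flat mapping keyed by unit name, where each value carries `name`, `state`, `status`, and `source`. Once registered in a play, it can be filtered with ordinary dict operations. A minimal sketch (the sample payload below is a hand-trimmed excerpt of the output above, not live data, and `running()` is a hypothetical helper, not part of Ansible):

```python
import json

# Assumption: this mirrors the service_facts return structure shown in the log
# above (ansible_facts.services -> {unit: {"name", "state", "status", "source"}}).
# The sample is a trimmed excerpt of that output, not a live module result.
sample = json.loads("""
{"ansible_facts": {"services": {
  "sshd.service":      {"name": "sshd.service",      "state": "running",  "status": "enabled",  "source": "systemd"},
  "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"},
  "auditd.service":    {"name": "auditd.service",    "state": "running",  "status": "enabled",  "source": "systemd"}
}}}
""")

services = sample["ansible_facts"]["services"]


def running(services):
    """Return names of units whose reported state is 'running', sorted for stable output."""
    return sorted(name for name, svc in services.items() if svc["state"] == "running")


print(running(services))  # prints ['auditd.service', 'sshd.service']
```

The same filter is what a play-level expression such as `services | dict2items | selectattr('value.state', 'eq', 'running')` performs on the registered fact.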
44109 1727204259.30772: done with _execute_module (service_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'service_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204257.408075-46613-32596252086080/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 44109 1727204259.30788: _low_level_execute_command(): starting 44109 1727204259.30796: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204257.408075-46613-32596252086080/ > /dev/null 2>&1 && sleep 0' 44109 1727204259.31429: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44109 1727204259.31443: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44109 1727204259.31458: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44109 1727204259.31480: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44109 1727204259.31507: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 <<< 44109 1727204259.31599: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44109 1727204259.31622: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 44109 1727204259.31642: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44109 1727204259.31670: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44109 1727204259.31783: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44109 1727204259.33988: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44109 1727204259.33993: stdout chunk (state=3): >>><<< 44109 1727204259.33995: stderr chunk (state=3): >>><<< 44109 1727204259.33998: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: 
mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44109 1727204259.34000: handler run complete 44109 1727204259.34057: variable 'ansible_facts' from source: unknown 44109 1727204259.34219: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44109 1727204259.34714: variable 'ansible_facts' from source: unknown 44109 1727204259.34848: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44109 1727204259.35090: attempt loop complete, returning result 44109 1727204259.35093: _execute() done 44109 1727204259.35095: dumping result to json 44109 1727204259.35136: done dumping result, returning 44109 1727204259.35149: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Check which services are running [028d2410-947f-ed67-a560-0000000005e3] 44109 1727204259.35159: sending task result for task 028d2410-947f-ed67-a560-0000000005e3 44109 1727204259.36307: done sending task result for task 028d2410-947f-ed67-a560-0000000005e3 44109 1727204259.36310: WORKER PROCESS EXITING ok: [managed-node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 44109 1727204259.36385: no more pending results, returning what we have 44109 1727204259.36388: results queue empty 44109 1727204259.36389: checking for any_errors_fatal 44109 1727204259.36393: done checking for any_errors_fatal 44109 1727204259.36394: checking for max_fail_percentage 44109 1727204259.36396: done checking for max_fail_percentage 44109 1727204259.36397: checking to see if all hosts have failed and the running result is not ok 44109 1727204259.36398: done checking to see if all hosts have failed 44109 1727204259.36398: getting the remaining hosts for this loop 44109 1727204259.36400: done getting the remaining 
hosts for this loop 44109 1727204259.36403: getting the next task for host managed-node1 44109 1727204259.36408: done getting next task for host managed-node1 44109 1727204259.36411: ^ task is: TASK: fedora.linux_system_roles.network : Check which packages are installed 44109 1727204259.36414: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44109 1727204259.36423: getting variables 44109 1727204259.36425: in VariableManager get_vars() 44109 1727204259.36455: Calling all_inventory to load vars for managed-node1 44109 1727204259.36458: Calling groups_inventory to load vars for managed-node1 44109 1727204259.36461: Calling all_plugins_inventory to load vars for managed-node1 44109 1727204259.36470: Calling all_plugins_play to load vars for managed-node1 44109 1727204259.36473: Calling groups_plugins_inventory to load vars for managed-node1 44109 1727204259.36478: Calling groups_plugins_play to load vars for managed-node1 44109 1727204259.37811: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44109 1727204259.39122: done with get_vars() 44109 1727204259.39140: done getting variables TASK [fedora.linux_system_roles.network : Check which packages are installed] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 Tuesday 24 September 2024 14:57:39 -0400 (0:00:02.032) 0:00:36.188 ***** 44109 1727204259.39211: entering 
_queue_task() for managed-node1/package_facts 44109 1727204259.39463: worker is 1 (out of 1 available) 44109 1727204259.39479: exiting _queue_task() for managed-node1/package_facts 44109 1727204259.39492: done queuing things up, now waiting for results queue to drain 44109 1727204259.39493: waiting for pending results... 44109 1727204259.39671: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Check which packages are installed 44109 1727204259.39761: in run() - task 028d2410-947f-ed67-a560-0000000005e4 44109 1727204259.39773: variable 'ansible_search_path' from source: unknown 44109 1727204259.39778: variable 'ansible_search_path' from source: unknown 44109 1727204259.39805: calling self._execute() 44109 1727204259.39883: variable 'ansible_host' from source: host vars for 'managed-node1' 44109 1727204259.39887: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44109 1727204259.39897: variable 'omit' from source: magic vars 44109 1727204259.40176: variable 'ansible_distribution_major_version' from source: facts 44109 1727204259.40187: Evaluated conditional (ansible_distribution_major_version != '6'): True 44109 1727204259.40192: variable 'omit' from source: magic vars 44109 1727204259.40234: variable 'omit' from source: magic vars 44109 1727204259.40256: variable 'omit' from source: magic vars 44109 1727204259.40294: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 44109 1727204259.40322: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 44109 1727204259.40337: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 44109 1727204259.40350: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44109 1727204259.40360: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44109 1727204259.40388: variable 'inventory_hostname' from source: host vars for 'managed-node1' 44109 1727204259.40392: variable 'ansible_host' from source: host vars for 'managed-node1' 44109 1727204259.40394: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44109 1727204259.40465: Set connection var ansible_connection to ssh 44109 1727204259.40468: Set connection var ansible_timeout to 10 44109 1727204259.40474: Set connection var ansible_module_compression to ZIP_DEFLATED 44109 1727204259.40483: Set connection var ansible_pipelining to False 44109 1727204259.40527: Set connection var ansible_shell_executable to /bin/sh 44109 1727204259.40530: Set connection var ansible_shell_type to sh 44109 1727204259.40533: variable 'ansible_shell_executable' from source: unknown 44109 1727204259.40535: variable 'ansible_connection' from source: unknown 44109 1727204259.40538: variable 'ansible_module_compression' from source: unknown 44109 1727204259.40541: variable 'ansible_shell_type' from source: unknown 44109 1727204259.40543: variable 'ansible_shell_executable' from source: unknown 44109 1727204259.40545: variable 'ansible_host' from source: host vars for 'managed-node1' 44109 1727204259.40547: variable 'ansible_pipelining' from source: unknown 44109 1727204259.40549: variable 'ansible_timeout' from source: unknown 44109 1727204259.40551: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44109 1727204259.40792: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 44109 1727204259.40796: variable 'omit' from source: magic vars 44109 1727204259.40799: starting attempt loop 44109 1727204259.40801: running 
the handler 44109 1727204259.40803: _low_level_execute_command(): starting 44109 1727204259.40812: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 44109 1727204259.41480: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44109 1727204259.41496: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44109 1727204259.41510: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44109 1727204259.41533: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44109 1727204259.41552: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 <<< 44109 1727204259.41566: stderr chunk (state=3): >>>debug2: match not found <<< 44109 1727204259.41643: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44109 1727204259.41657: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK <<< 44109 1727204259.41688: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44109 1727204259.41762: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44109 1727204259.43530: stdout chunk (state=3): >>>/root <<< 44109 1727204259.43641: stderr chunk (state=3): >>>debug2: Received exit status from master 0 
<<< 44109 1727204259.43664: stderr chunk (state=3): >>><<< 44109 1727204259.43667: stdout chunk (state=3): >>><<< 44109 1727204259.43692: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44109 1727204259.43759: _low_level_execute_command(): starting 44109 1727204259.43763: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204259.4368825-46671-259246280671945 `" && echo ansible-tmp-1727204259.4368825-46671-259246280671945="` echo /root/.ansible/tmp/ansible-tmp-1727204259.4368825-46671-259246280671945 `" ) && sleep 0' 44109 1727204259.44399: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44109 1727204259.44403: stderr chunk (state=3): >>>debug1: Reading configuration data 
/root/.ssh/config <<< 44109 1727204259.44429: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44109 1727204259.44445: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44109 1727204259.44546: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 <<< 44109 1727204259.44550: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44109 1727204259.44590: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 44109 1727204259.44607: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44109 1727204259.44634: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44109 1727204259.44823: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44109 1727204259.46915: stdout chunk (state=3): >>>ansible-tmp-1727204259.4368825-46671-259246280671945=/root/.ansible/tmp/ansible-tmp-1727204259.4368825-46671-259246280671945 <<< 44109 1727204259.47050: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44109 1727204259.47095: stderr chunk (state=3): >>><<< 44109 1727204259.47098: stdout chunk (state=3): >>><<< 44109 1727204259.47117: _low_level_execute_command() done: rc=0, 
stdout=ansible-tmp-1727204259.4368825-46671-259246280671945=/root/.ansible/tmp/ansible-tmp-1727204259.4368825-46671-259246280671945 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44109 1727204259.47285: variable 'ansible_module_compression' from source: unknown 44109 1727204259.47288: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-44109pzfqangk/ansiballz_cache/ansible.modules.package_facts-ZIP_DEFLATED 44109 1727204259.47297: variable 'ansible_facts' from source: unknown 44109 1727204259.47514: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204259.4368825-46671-259246280671945/AnsiballZ_package_facts.py 44109 1727204259.47760: Sending initial data 44109 1727204259.47769: Sent initial data (162 bytes) 44109 1727204259.48391: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 44109 1727204259.48515: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44109 1727204259.48532: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44109 1727204259.48635: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44109 1727204259.50372: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension 
"users-groups-by-id@openssh.com" revision 1 <<< 44109 1727204259.50468: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 44109 1727204259.50568: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-44109pzfqangk/tmpjxff1802 /root/.ansible/tmp/ansible-tmp-1727204259.4368825-46671-259246280671945/AnsiballZ_package_facts.py <<< 44109 1727204259.50571: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204259.4368825-46671-259246280671945/AnsiballZ_package_facts.py" <<< 44109 1727204259.50661: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-44109pzfqangk/tmpjxff1802" to remote "/root/.ansible/tmp/ansible-tmp-1727204259.4368825-46671-259246280671945/AnsiballZ_package_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204259.4368825-46671-259246280671945/AnsiballZ_package_facts.py" <<< 44109 1727204259.52802: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44109 1727204259.52806: stdout chunk (state=3): >>><<< 44109 1727204259.52808: stderr chunk (state=3): >>><<< 44109 1727204259.52810: done transferring module to remote 44109 1727204259.52812: _low_level_execute_command(): starting 44109 1727204259.52815: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204259.4368825-46671-259246280671945/ /root/.ansible/tmp/ansible-tmp-1727204259.4368825-46671-259246280671945/AnsiballZ_package_facts.py && sleep 0' 44109 1727204259.53594: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 44109 1727204259.53612: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44109 1727204259.53626: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44109 1727204259.53735: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44109 1727204259.55731: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44109 1727204259.55743: stdout chunk (state=3): >>><<< 44109 1727204259.55761: stderr chunk (state=3): >>><<< 44109 1727204259.55783: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44109 1727204259.55790: _low_level_execute_command(): starting 44109 1727204259.55799: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204259.4368825-46671-259246280671945/AnsiballZ_package_facts.py && sleep 0' 44109 1727204259.56339: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44109 1727204259.56361: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44109 1727204259.56364: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44109 1727204259.56426: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK <<< 44109 1727204259.56429: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master 
version 4 <<< 44109 1727204259.56510: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44109 1727204260.03657: stdout chunk (state=3): >>> {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-mono-vf-fonts": [{"name": "redhat-mono-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-text-vf-fonts": [{"name": "redhat-text-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.1", "release": "2.el10", "epoch": null, "arch": 
"noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": 
"linux-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.379", "release": "10.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "10.0", "release": 
"0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "21.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.6", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": 
"2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.5", "rele<<< 44109 1727204260.03728: stdout chunk (state=3): >>>ase": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.30", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": 
"libgpg-error", "version": "1.50", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.46.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "51.20230828cvs.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "9.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"libacl": [{"name": "libacl", "version": "2.3.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.10.0", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.5.0", "release": "1.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": 
"1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.0", "release": "3.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.4", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "12.el10", 
"epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "101.2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.11.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.7", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.3", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.6.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", 
"version": "1.4.5", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "42.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.6.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.29", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.79.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsecret": [{"name": "libsecret", "version": "0.21.2", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pinentry": [{"name": "pinentry", "version": "1.3.0", "release": "2.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.7.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "9.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.7", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.10", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.8", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "35", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "6.11.0", "release": "25.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "661", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "memstrack": [{"name": "memstrack", "version": "0.2.5", 
"release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.25.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "22.el10.gating_test1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse3-libs": [{"name": "fuse3-libs", "version": "3.16.2", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "28.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "10.el10", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.62.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "6.el10", "epoch": null, "arch": "x86_64", "source":<<< 44109 1727204260.03780: stdout chunk (state=3): >>> "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-gssapi": [{"name": "cyrus-sasl-gssapi", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": 
[{"name": "libcurl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": 
"1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11^20190603git9e74f2d", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "29.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.8p1", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240905", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.22.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "makedumpfile": [{"name": "makedumpfile", 
"version": "1.7.5", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "76.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", 
"release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.11.0", "release": "25.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.11.0", "release": "25.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.11.0", "release": "25.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.32", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-krb5-common": [{"name": "sssd-krb5-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": 
[{"name": "python-unversioned-command", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "15.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": 
"python3-dateutil", "version": "2.8.2", "release": "14.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.29", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kdump-utils": [{"name": "kdump-utils", "version": "1.0.43", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.13.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.6.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.3", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.24", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "1.2304.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2408.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": 
"rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "6.11.0", "release": "25.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.11.0", "release": "25.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.8p1", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.8p1", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": 
"1.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "8.el10", "epoch": 18, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.2.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.20", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "12.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "iwlwifi-dvm-firmware": [{"name": "iwlwifi-dvm-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwlwifi-mvm-firmware": [{"name": "iwlwifi-mvm-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "37.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.89", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name":<<< 44109 1727204260.03834: stdout chunk (state=3): >>> "perl-FileHandle", "version": "2.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.189", "release": "511.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "510.el10", "epoch": null, "arch": "noarch", "source": 
"rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.27", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.12", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "510.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": 
[{"name": "perl-POSIX", "version": "2.20", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.18", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "510.el10", 
"epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.29", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.55", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "510.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.38", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "510.el10", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "510.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": 
"perl-Getopt-Long", "version": "2.58", "release": "2.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.91", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.56", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.17", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.44", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xxd": [{"name": 
"xxd", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.39", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.2.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "24.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "4.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.10", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.24.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": 
"keyutils", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.el10", "epoch<<< 44109 1727204260.03864: stdout chunk (state=3): 
>>>": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", 
"version": "69.0.3", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "18.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.32.3", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.6", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "17.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "20.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.11", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.24", 
"release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip": [{"name": "python3-pip", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "11.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} <<< 44109 1727204260.06030: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.47 closed. 
<<< 44109 1727204260.06035: stdout chunk (state=3): >>><<< 44109 1727204260.06037: stderr chunk (state=3): >>><<< 44109 1727204260.06194: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-mono-vf-fonts": [{"name": "redhat-mono-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-text-vf-fonts": [{"name": "redhat-text-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.1", "release": "2.el10", "epoch": null, 
"arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": 
[{"name": "linux-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.379", "release": "10.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "10.0", "release": 
"0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "21.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.6", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": 
"2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.5", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.30", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.50", "release": "1.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.46.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "51.20230828cvs.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "9.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": 
"3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.10.0", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.5.0", "release": "1.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.0", "release": "3.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.4", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": 
[{"name": "coreutils", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "101.2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.11.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.7", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", 
"release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.3", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.6.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "11.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "42.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.6.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.29", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.79.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsecret": [{"name": "libsecret", "version": "0.21.2", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pinentry": [{"name": "pinentry", "version": "1.3.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", 
"release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.7.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": 
"ethtool", "version": "6.7", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.10", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.8", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "35", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "6.11.0", "release": "25.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "661", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "memstrack": [{"name": "memstrack", "version": "0.2.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"c-ares": [{"name": "c-ares", "version": "1.25.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "22.el10.gating_test1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse3-libs": [{"name": "fuse3-libs", "version": "3.16.2", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "28.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": 
"iptables-libs", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.62.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-gssapi": [{"name": "cyrus-sasl-gssapi", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": 
[{"name": "authselect-libs", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": 
"1.11^20190603git9e74f2d", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "29.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.8p1", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240905", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.22.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "makedumpfile": [{"name": "makedumpfile", "version": "1.7.5", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": 
"rpm-build-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "76.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", 
"version": "6.11.0", "release": "25.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.11.0", "release": "25.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.11.0", "release": "25.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.32", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-krb5-common": [{"name": "sssd-krb5-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "noarch", "source": 
"rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "15.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "14.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": 
[{"name": "python3-systemd", "version": "235", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.29", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kdump-utils": [{"name": "kdump-utils", "version": "1.0.43", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.13.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.6.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"python3-dnf": [{"name": "python3-dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.3", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.24", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "1.2304.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2408.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "6.11.0", "release": "25.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.11.0", "release": "25.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.8p1", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.8p1", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": 
"8.el10", "epoch": 18, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.2.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.20", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwlwifi-dvm-firmware": [{"name": "iwlwifi-dvm-firmware", "version": "20240910", 
"release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwlwifi-mvm-firmware": [{"name": "iwlwifi-mvm-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "37.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.89", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.189", "release": "511.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.27", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": 
"5.74", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.12", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "510.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.20", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "510.el10", 
"epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.18", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.29", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", 
"version": "1.55", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "510.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.38", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "510.el10", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "510.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.58", "release": "2.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "510.el10", "epoch": null, "arch": "noarch", 
"source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.91", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.56", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.17", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.44", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.39", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"python3-lxml": [{"name": "python3-lxml", "version": "5.2.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "24.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "4.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.10", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.24.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": 
"rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": 
"python3-jsonschema", "version": "4.19.1", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": 
"python3-policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "18.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.32.3", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.6", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "17.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "20.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.11", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", 
"version": "3.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip": [{"name": "python3-pip", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "11.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared 
connection to 10.31.14.47 closed. 44109 1727204260.16285: done with _execute_module (package_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'package_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204259.4368825-46671-259246280671945/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 44109 1727204260.16289: _low_level_execute_command(): starting 44109 1727204260.16292: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204259.4368825-46671-259246280671945/ > /dev/null 2>&1 && sleep 0' 44109 1727204260.17683: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44109 
1727204260.17800: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK <<< 44109 1727204260.17831: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44109 1727204260.17995: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44109 1727204260.19990: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44109 1727204260.20028: stderr chunk (state=3): >>><<< 44109 1727204260.20032: stdout chunk (state=3): >>><<< 44109 1727204260.20048: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44109 1727204260.20052: handler run complete 44109 1727204260.21089: variable 'ansible_facts' from source: unknown 44109 1727204260.21453: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44109 1727204260.23555: variable 'ansible_facts' from source: unknown 44109 1727204260.23986: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44109 1727204260.25235: attempt loop complete, returning result 44109 1727204260.25250: _execute() done 44109 1727204260.25253: dumping result to json 44109 1727204260.25578: done dumping result, returning 44109 1727204260.25805: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Check which packages are installed [028d2410-947f-ed67-a560-0000000005e4] 44109 1727204260.25808: sending task result for task 028d2410-947f-ed67-a560-0000000005e4 44109 1727204260.35805: done sending task result for task 028d2410-947f-ed67-a560-0000000005e4 44109 1727204260.35808: WORKER PROCESS EXITING ok: [managed-node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 44109 1727204260.35866: no more pending results, returning what we have 44109 1727204260.35868: results queue empty 44109 1727204260.35869: checking for any_errors_fatal 44109 1727204260.35872: done checking for any_errors_fatal 44109 1727204260.35873: checking for max_fail_percentage 44109 1727204260.35874: done checking for max_fail_percentage 44109 1727204260.35878: checking to see if all hosts have failed and the running result is not ok 44109 1727204260.35879: done checking to see if all hosts have failed 44109 1727204260.35879: getting the remaining hosts for this loop 44109 1727204260.35881: done getting the remaining hosts for this loop 44109 1727204260.35883: getting the next task for host managed-node1 44109 1727204260.35888: done getting next task for host managed-node1 44109 1727204260.35891: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider 44109 1727204260.35893: 
^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44109 1727204260.35907: getting variables 44109 1727204260.35909: in VariableManager get_vars() 44109 1727204260.35931: Calling all_inventory to load vars for managed-node1 44109 1727204260.35933: Calling groups_inventory to load vars for managed-node1 44109 1727204260.35935: Calling all_plugins_inventory to load vars for managed-node1 44109 1727204260.35941: Calling all_plugins_play to load vars for managed-node1 44109 1727204260.35944: Calling groups_plugins_inventory to load vars for managed-node1 44109 1727204260.35946: Calling groups_plugins_play to load vars for managed-node1 44109 1727204260.37255: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44109 1727204260.38986: done with get_vars() 44109 1727204260.39008: done getting variables 44109 1727204260.39062: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print network provider] ************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Tuesday 24 September 2024 14:57:40 -0400 (0:00:00.998) 0:00:37.187 ***** 44109 1727204260.39092: entering _queue_task() for managed-node1/debug 44109 1727204260.39459: worker is 1 (out of 1 available) 44109 1727204260.39470: exiting _queue_task() for managed-node1/debug 44109 1727204260.39485: done queuing things up, now waiting for results queue to drain 44109 
1727204260.39486: waiting for pending results... 44109 1727204260.39767: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Print network provider 44109 1727204260.39888: in run() - task 028d2410-947f-ed67-a560-000000000092 44109 1727204260.39918: variable 'ansible_search_path' from source: unknown 44109 1727204260.39927: variable 'ansible_search_path' from source: unknown 44109 1727204260.39968: calling self._execute() 44109 1727204260.40074: variable 'ansible_host' from source: host vars for 'managed-node1' 44109 1727204260.40088: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44109 1727204260.40119: variable 'omit' from source: magic vars 44109 1727204260.40497: variable 'ansible_distribution_major_version' from source: facts 44109 1727204260.40553: Evaluated conditional (ansible_distribution_major_version != '6'): True 44109 1727204260.40557: variable 'omit' from source: magic vars 44109 1727204260.40579: variable 'omit' from source: magic vars 44109 1727204260.40685: variable 'network_provider' from source: set_fact 44109 1727204260.40710: variable 'omit' from source: magic vars 44109 1727204260.40760: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 44109 1727204260.40881: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 44109 1727204260.40884: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 44109 1727204260.40886: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44109 1727204260.40889: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44109 1727204260.40902: variable 'inventory_hostname' from source: host vars for 'managed-node1' 44109 1727204260.40909: variable 'ansible_host' from 
source: host vars for 'managed-node1' 44109 1727204260.40920: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44109 1727204260.41034: Set connection var ansible_connection to ssh 44109 1727204260.41046: Set connection var ansible_timeout to 10 44109 1727204260.41058: Set connection var ansible_module_compression to ZIP_DEFLATED 44109 1727204260.41072: Set connection var ansible_pipelining to False 44109 1727204260.41084: Set connection var ansible_shell_executable to /bin/sh 44109 1727204260.41098: Set connection var ansible_shell_type to sh 44109 1727204260.41128: variable 'ansible_shell_executable' from source: unknown 44109 1727204260.41136: variable 'ansible_connection' from source: unknown 44109 1727204260.41142: variable 'ansible_module_compression' from source: unknown 44109 1727204260.41148: variable 'ansible_shell_type' from source: unknown 44109 1727204260.41154: variable 'ansible_shell_executable' from source: unknown 44109 1727204260.41160: variable 'ansible_host' from source: host vars for 'managed-node1' 44109 1727204260.41205: variable 'ansible_pipelining' from source: unknown 44109 1727204260.41209: variable 'ansible_timeout' from source: unknown 44109 1727204260.41211: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44109 1727204260.41340: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 44109 1727204260.41358: variable 'omit' from source: magic vars 44109 1727204260.41367: starting attempt loop 44109 1727204260.41373: running the handler 44109 1727204260.41532: handler run complete 44109 1727204260.41535: attempt loop complete, returning result 44109 1727204260.41538: _execute() done 44109 1727204260.41540: dumping result to json 44109 
1727204260.41542: done dumping result, returning 44109 1727204260.41544: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Print network provider [028d2410-947f-ed67-a560-000000000092] 44109 1727204260.41546: sending task result for task 028d2410-947f-ed67-a560-000000000092 44109 1727204260.41627: done sending task result for task 028d2410-947f-ed67-a560-000000000092 44109 1727204260.41630: WORKER PROCESS EXITING ok: [managed-node1] => {} MSG: Using network provider: nm 44109 1727204260.41691: no more pending results, returning what we have 44109 1727204260.41695: results queue empty 44109 1727204260.41696: checking for any_errors_fatal 44109 1727204260.41706: done checking for any_errors_fatal 44109 1727204260.41707: checking for max_fail_percentage 44109 1727204260.41709: done checking for max_fail_percentage 44109 1727204260.41709: checking to see if all hosts have failed and the running result is not ok 44109 1727204260.41710: done checking to see if all hosts have failed 44109 1727204260.41711: getting the remaining hosts for this loop 44109 1727204260.41712: done getting the remaining hosts for this loop 44109 1727204260.41716: getting the next task for host managed-node1 44109 1727204260.41722: done getting next task for host managed-node1 44109 1727204260.41726: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 44109 1727204260.41727: ^ state is: HOST STATE: block=2, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44109 1727204260.41737: getting variables 44109 1727204260.41738: in VariableManager get_vars() 44109 1727204260.41774: Calling all_inventory to load vars for managed-node1 44109 1727204260.41779: Calling groups_inventory to load vars for managed-node1 44109 1727204260.41781: Calling all_plugins_inventory to load vars for managed-node1 44109 1727204260.41790: Calling all_plugins_play to load vars for managed-node1 44109 1727204260.41793: Calling groups_plugins_inventory to load vars for managed-node1 44109 1727204260.41795: Calling groups_plugins_play to load vars for managed-node1 44109 1727204260.43284: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44109 1727204260.45022: done with get_vars() 44109 1727204260.45051: done getting variables 44109 1727204260.45126: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Tuesday 24 September 2024 14:57:40 -0400 (0:00:00.060) 0:00:37.247 ***** 44109 1727204260.45159: entering _queue_task() for managed-node1/fail 44109 1727204260.45550: worker is 1 (out of 1 available) 44109 1727204260.45563: exiting _queue_task() for managed-node1/fail 44109 1727204260.45574: done queuing things up, now waiting for results queue to drain 44109 1727204260.45579: waiting for pending results... 
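The entries above show the role's "Print network provider" debug task: the `network_provider` variable (set earlier via `set_fact`) is resolved and the handler emits "Using network provider: nm". A minimal sketch of an equivalent task, reconstructed from the log rather than copied from the role's actual source:

```yaml
# Hedged reconstruction of the task this log section executes; the real
# task lives in fedora.linux_system_roles.network and may differ in detail.
- name: Print network provider
  debug:
    msg: "Using network provider: {{ network_provider }}"
  # The log shows this conditional evaluated True before the handler ran.
  when: ansible_distribution_major_version != '6'
```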
44109 1727204260.45880: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 44109 1727204260.46084: in run() - task 028d2410-947f-ed67-a560-000000000093 44109 1727204260.46088: variable 'ansible_search_path' from source: unknown 44109 1727204260.46090: variable 'ansible_search_path' from source: unknown 44109 1727204260.46093: calling self._execute() 44109 1727204260.46155: variable 'ansible_host' from source: host vars for 'managed-node1' 44109 1727204260.46166: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44109 1727204260.46186: variable 'omit' from source: magic vars 44109 1727204260.46554: variable 'ansible_distribution_major_version' from source: facts 44109 1727204260.46572: Evaluated conditional (ansible_distribution_major_version != '6'): True 44109 1727204260.46702: variable 'network_state' from source: role '' defaults 44109 1727204260.46718: Evaluated conditional (network_state != {}): False 44109 1727204260.46736: when evaluation is False, skipping this task 44109 1727204260.46739: _execute() done 44109 1727204260.46742: dumping result to json 44109 1727204260.46846: done dumping result, returning 44109 1727204260.46850: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider [028d2410-947f-ed67-a560-000000000093] 44109 1727204260.46853: sending task result for task 028d2410-947f-ed67-a560-000000000093 skipping: [managed-node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 44109 1727204260.46971: no more pending results, returning what we have 44109 1727204260.46977: results queue empty 44109 1727204260.46978: checking for any_errors_fatal 44109 1727204260.46986: done 
checking for any_errors_fatal 44109 1727204260.46987: checking for max_fail_percentage 44109 1727204260.46988: done checking for max_fail_percentage 44109 1727204260.46989: checking to see if all hosts have failed and the running result is not ok 44109 1727204260.46990: done checking to see if all hosts have failed 44109 1727204260.46991: getting the remaining hosts for this loop 44109 1727204260.46992: done getting the remaining hosts for this loop 44109 1727204260.46995: getting the next task for host managed-node1 44109 1727204260.47002: done getting next task for host managed-node1 44109 1727204260.47006: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 44109 1727204260.47008: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44109 1727204260.47023: getting variables 44109 1727204260.47025: in VariableManager get_vars() 44109 1727204260.47063: Calling all_inventory to load vars for managed-node1 44109 1727204260.47066: Calling groups_inventory to load vars for managed-node1 44109 1727204260.47068: Calling all_plugins_inventory to load vars for managed-node1 44109 1727204260.47188: Calling all_plugins_play to load vars for managed-node1 44109 1727204260.47192: Calling groups_plugins_inventory to load vars for managed-node1 44109 1727204260.47197: Calling groups_plugins_play to load vars for managed-node1 44109 1727204260.47891: done sending task result for task 028d2410-947f-ed67-a560-000000000093 44109 1727204260.47895: WORKER PROCESS EXITING 44109 1727204260.48915: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44109 1727204260.50537: done with get_vars() 44109 1727204260.50568: done getting variables 44109 1727204260.50639: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18 Tuesday 24 September 2024 14:57:40 -0400 (0:00:00.055) 0:00:37.303 ***** 44109 1727204260.50672: entering _queue_task() for managed-node1/fail 44109 1727204260.51118: worker is 1 (out of 1 available) 44109 1727204260.51131: exiting _queue_task() for managed-node1/fail 44109 1727204260.51142: done queuing things up, now waiting for results queue to drain 44109 1727204260.51143: waiting for pending results... 
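Both "Abort applying the network state configuration ..." tasks in this section are skipped the same way: the role defaults `network_state` to `{}`, so the guard `network_state != {}` evaluates False and the `fail` action never fires. A sketch of that pattern, reconstructed from the skip output ("false_condition": "network_state != {}"); the failure message is an illustrative assumption:

```yaml
# Hedged sketch of the guarded fail task reported in the log; only the
# task name and the when-condition are taken from the log verbatim.
- name: >-
    Abort applying the network state configuration if using the
    network_state variable with the initscripts provider
  fail:
    msg: Applying network_state is not supported here  # hypothetical message
  when: network_state != {}
```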
44109 1727204260.51417: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 44109 1727204260.51541: in run() - task 028d2410-947f-ed67-a560-000000000094 44109 1727204260.51563: variable 'ansible_search_path' from source: unknown 44109 1727204260.51572: variable 'ansible_search_path' from source: unknown 44109 1727204260.51623: calling self._execute() 44109 1727204260.51736: variable 'ansible_host' from source: host vars for 'managed-node1' 44109 1727204260.51748: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44109 1727204260.51765: variable 'omit' from source: magic vars 44109 1727204260.52155: variable 'ansible_distribution_major_version' from source: facts 44109 1727204260.52171: Evaluated conditional (ansible_distribution_major_version != '6'): True 44109 1727204260.52291: variable 'network_state' from source: role '' defaults 44109 1727204260.52307: Evaluated conditional (network_state != {}): False 44109 1727204260.52316: when evaluation is False, skipping this task 44109 1727204260.52323: _execute() done 44109 1727204260.52330: dumping result to json 44109 1727204260.52337: done dumping result, returning 44109 1727204260.52347: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 [028d2410-947f-ed67-a560-000000000094] 44109 1727204260.52356: sending task result for task 028d2410-947f-ed67-a560-000000000094 44109 1727204260.52466: done sending task result for task 028d2410-947f-ed67-a560-000000000094 44109 1727204260.52479: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 44109 1727204260.52529: no more pending results, returning what we have 44109 
1727204260.52534: results queue empty 44109 1727204260.52535: checking for any_errors_fatal 44109 1727204260.52543: done checking for any_errors_fatal 44109 1727204260.52544: checking for max_fail_percentage 44109 1727204260.52547: done checking for max_fail_percentage 44109 1727204260.52548: checking to see if all hosts have failed and the running result is not ok 44109 1727204260.52549: done checking to see if all hosts have failed 44109 1727204260.52549: getting the remaining hosts for this loop 44109 1727204260.52551: done getting the remaining hosts for this loop 44109 1727204260.52554: getting the next task for host managed-node1 44109 1727204260.52562: done getting next task for host managed-node1 44109 1727204260.52566: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 44109 1727204260.52568: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44109 1727204260.52680: getting variables 44109 1727204260.52683: in VariableManager get_vars() 44109 1727204260.52725: Calling all_inventory to load vars for managed-node1 44109 1727204260.52728: Calling groups_inventory to load vars for managed-node1 44109 1727204260.52730: Calling all_plugins_inventory to load vars for managed-node1 44109 1727204260.52742: Calling all_plugins_play to load vars for managed-node1 44109 1727204260.52745: Calling groups_plugins_inventory to load vars for managed-node1 44109 1727204260.52747: Calling groups_plugins_play to load vars for managed-node1 44109 1727204260.54363: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44109 1727204260.56016: done with get_vars() 44109 1727204260.56045: done getting variables 44109 1727204260.56117: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25 Tuesday 24 September 2024 14:57:40 -0400 (0:00:00.054) 0:00:37.357 ***** 44109 1727204260.56151: entering _queue_task() for managed-node1/fail 44109 1727204260.56517: worker is 1 (out of 1 available) 44109 1727204260.56645: exiting _queue_task() for managed-node1/fail 44109 1727204260.56655: done queuing things up, now waiting for results queue to drain 44109 1727204260.56656: waiting for pending results... 
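The task just queued ("Abort applying teaming configuration if the system version of the managed host is EL10 or later") guards a `fail` action with three conditions that the log evaluates in order: the major version is above 9, the distribution is in `__network_rh_distros`, and a Jinja2 `selectattr` chain finds at least one `type: team` entry in `network_connections` or in `network_state.interfaces`. A sketch of that guard; the condition text is copied from the log's `false_condition` output, while the surrounding task fields are assumptions:

```yaml
# Hedged sketch; in this run the first two conditions evaluate True and
# the selectattr chain evaluates False, so the task is skipped.
- name: Abort applying teaming configuration on EL10 or later
  fail:
    msg: Team interfaces are not supported on this release  # hypothetical
  when:
    - ansible_distribution_major_version | int > 9
    - ansible_distribution in __network_rh_distros
    - network_connections | selectattr("type", "defined")
      | selectattr("type", "match", "^team$") | list | length > 0
      or network_state.get("interfaces", []) | selectattr("type", "defined")
      | selectattr("type", "match", "^team$") | list | length > 0
```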
44109 1727204260.56961: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 44109 1727204260.57017: in run() - task 028d2410-947f-ed67-a560-000000000095 44109 1727204260.57041: variable 'ansible_search_path' from source: unknown 44109 1727204260.57049: variable 'ansible_search_path' from source: unknown 44109 1727204260.57101: calling self._execute() 44109 1727204260.57214: variable 'ansible_host' from source: host vars for 'managed-node1' 44109 1727204260.57226: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44109 1727204260.57242: variable 'omit' from source: magic vars 44109 1727204260.57650: variable 'ansible_distribution_major_version' from source: facts 44109 1727204260.57668: Evaluated conditional (ansible_distribution_major_version != '6'): True 44109 1727204260.57947: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 44109 1727204260.60419: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 44109 1727204260.60496: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 44109 1727204260.60536: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 44109 1727204260.60584: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 44109 1727204260.60617: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 44109 1727204260.60704: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44109 1727204260.60741: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44109 1727204260.60778: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44109 1727204260.60821: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44109 1727204260.60840: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44109 1727204260.60940: variable 'ansible_distribution_major_version' from source: facts 44109 1727204260.60961: Evaluated conditional (ansible_distribution_major_version | int > 9): True 44109 1727204260.61086: variable 'ansible_distribution' from source: facts 44109 1727204260.61181: variable '__network_rh_distros' from source: role '' defaults 44109 1727204260.61184: Evaluated conditional (ansible_distribution in __network_rh_distros): True 44109 1727204260.61370: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44109 1727204260.61424: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44109 1727204260.61453: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44109 
1727204260.61507: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44109 1727204260.61528: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44109 1727204260.61580: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44109 1727204260.61617: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44109 1727204260.61646: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44109 1727204260.61691: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44109 1727204260.61719: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44109 1727204260.61764: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44109 1727204260.61797: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, 
class_only=False) 44109 1727204260.61980: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44109 1727204260.61983: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44109 1727204260.61985: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44109 1727204260.62172: variable 'network_connections' from source: play vars 44109 1727204260.62190: variable 'profile' from source: play vars 44109 1727204260.62267: variable 'profile' from source: play vars 44109 1727204260.62279: variable 'interface' from source: set_fact 44109 1727204260.62350: variable 'interface' from source: set_fact 44109 1727204260.62367: variable 'network_state' from source: role '' defaults 44109 1727204260.62446: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 44109 1727204260.62603: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 44109 1727204260.62647: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 44109 1727204260.62682: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 44109 1727204260.62716: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 44109 1727204260.62768: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 44109 1727204260.62863: Loading 
TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 44109 1727204260.62867: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 44109 1727204260.62869: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 44109 1727204260.62890: Evaluated conditional (network_connections | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0 or network_state.get("interfaces", []) | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0): False 44109 1727204260.62898: when evaluation is False, skipping this task 44109 1727204260.62905: _execute() done 44109 1727204260.62911: dumping result to json 44109 1727204260.62918: done dumping result, returning 44109 1727204260.62929: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [028d2410-947f-ed67-a560-000000000095] 44109 1727204260.62937: sending task result for task 028d2410-947f-ed67-a560-000000000095 skipping: [managed-node1] => { "changed": false, "false_condition": "network_connections | selectattr(\"type\", \"defined\") | selectattr(\"type\", \"match\", \"^team$\") | list | length > 0 or network_state.get(\"interfaces\", []) | selectattr(\"type\", \"defined\") | selectattr(\"type\", \"match\", \"^team$\") | list | length > 0", "skip_reason": "Conditional result was False" } 44109 1727204260.63126: no more pending results, returning what we have 44109 1727204260.63130: results queue empty 44109 1727204260.63131: checking for 
any_errors_fatal 44109 1727204260.63138: done checking for any_errors_fatal 44109 1727204260.63138: checking for max_fail_percentage 44109 1727204260.63140: done checking for max_fail_percentage 44109 1727204260.63142: checking to see if all hosts have failed and the running result is not ok 44109 1727204260.63142: done checking to see if all hosts have failed 44109 1727204260.63143: getting the remaining hosts for this loop 44109 1727204260.63144: done getting the remaining hosts for this loop 44109 1727204260.63148: getting the next task for host managed-node1 44109 1727204260.63154: done getting next task for host managed-node1 44109 1727204260.63158: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 44109 1727204260.63160: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44109 1727204260.63173: getting variables 44109 1727204260.63174: in VariableManager get_vars() 44109 1727204260.63414: Calling all_inventory to load vars for managed-node1 44109 1727204260.63418: Calling groups_inventory to load vars for managed-node1 44109 1727204260.63420: Calling all_plugins_inventory to load vars for managed-node1 44109 1727204260.63429: Calling all_plugins_play to load vars for managed-node1 44109 1727204260.63432: Calling groups_plugins_inventory to load vars for managed-node1 44109 1727204260.63435: Calling groups_plugins_play to load vars for managed-node1 44109 1727204260.63991: done sending task result for task 028d2410-947f-ed67-a560-000000000095 44109 1727204260.63994: WORKER PROCESS EXITING 44109 1727204260.65131: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44109 1727204260.67198: done with get_vars() 44109 1727204260.67251: done getting variables 44109 1727204260.67351: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36 Tuesday 24 September 2024 14:57:40 -0400 (0:00:00.112) 0:00:37.470 ***** 44109 1727204260.67394: entering _queue_task() for managed-node1/dnf 44109 1727204260.67803: worker is 1 (out of 1 available) 44109 1727204260.67818: exiting _queue_task() for managed-node1/dnf 44109 1727204260.67830: done queuing things up, now waiting for results queue to drain 44109 1727204260.67831: waiting for pending results... 
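The DNF check task queued above (loaded through the `dnf` action plugin) is guarded by two role-default booleans, `__network_wireless_connections_defined` and `__network_team_connections_defined`; in this run the configured profile defines neither a wireless nor a team interface, so both are False and the task is skipped. A sketch built from the log's task name and `false_condition`; the module arguments are assumptions, not the role's source:

```yaml
# Hedged sketch; only the task name and when-condition come from the log.
- name: >-
    Check if updates for network packages are available through the DNF
    package manager due to wireless or team interfaces
  dnf:
    name: "{{ network_packages }}"  # hypothetical package list variable
    state: latest
  check_mode: true
  when: __network_wireless_connections_defined or __network_team_connections_defined
```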
44109 1727204260.68068: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 44109 1727204260.68189: in run() - task 028d2410-947f-ed67-a560-000000000096 44109 1727204260.68211: variable 'ansible_search_path' from source: unknown 44109 1727204260.68219: variable 'ansible_search_path' from source: unknown 44109 1727204260.68262: calling self._execute() 44109 1727204260.68367: variable 'ansible_host' from source: host vars for 'managed-node1' 44109 1727204260.68383: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44109 1727204260.68401: variable 'omit' from source: magic vars 44109 1727204260.68782: variable 'ansible_distribution_major_version' from source: facts 44109 1727204260.68801: Evaluated conditional (ansible_distribution_major_version != '6'): True 44109 1727204260.69000: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 44109 1727204260.71110: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 44109 1727204260.71182: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 44109 1727204260.71221: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 44109 1727204260.71263: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 44109 1727204260.71297: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 44109 1727204260.71372: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44109 1727204260.71426: Loading 
FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44109 1727204260.71454: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44109 1727204260.71500: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44109 1727204260.71519: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44109 1727204260.71630: variable 'ansible_distribution' from source: facts 44109 1727204260.71639: variable 'ansible_distribution_major_version' from source: facts 44109 1727204260.71658: Evaluated conditional (ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7): True 44109 1727204260.71772: variable '__network_wireless_connections_defined' from source: role '' defaults 44109 1727204260.71900: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44109 1727204260.71928: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44109 1727204260.71956: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44109 1727204260.71999: Loading 
FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44109 1727204260.72017: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44109 1727204260.72181: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44109 1727204260.72185: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44109 1727204260.72187: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44109 1727204260.72189: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44109 1727204260.72191: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44109 1727204260.72193: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44109 1727204260.72222: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44109 
1727204260.72250: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44109 1727204260.72292: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44109 1727204260.72308: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44109 1727204260.72452: variable 'network_connections' from source: play vars 44109 1727204260.72471: variable 'profile' from source: play vars 44109 1727204260.72537: variable 'profile' from source: play vars 44109 1727204260.72547: variable 'interface' from source: set_fact 44109 1727204260.72608: variable 'interface' from source: set_fact 44109 1727204260.72680: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 44109 1727204260.72846: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 44109 1727204260.72891: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 44109 1727204260.72925: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 44109 1727204260.72957: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 44109 1727204260.73004: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 44109 1727204260.73030: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 44109 1727204260.73068: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 44109 1727204260.73102: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 44109 1727204260.73157: variable '__network_team_connections_defined' from source: role '' defaults 44109 1727204260.73579: variable 'network_connections' from source: play vars 44109 1727204260.73583: variable 'profile' from source: play vars 44109 1727204260.73585: variable 'profile' from source: play vars 44109 1727204260.73588: variable 'interface' from source: set_fact 44109 1727204260.73589: variable 'interface' from source: set_fact 44109 1727204260.73591: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 44109 1727204260.73593: when evaluation is False, skipping this task 44109 1727204260.73595: _execute() done 44109 1727204260.73597: dumping result to json 44109 1727204260.73599: done dumping result, returning 44109 1727204260.73601: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [028d2410-947f-ed67-a560-000000000096] 44109 1727204260.73603: sending task result for task 028d2410-947f-ed67-a560-000000000096 skipping: [managed-node1] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 44109 1727204260.73742: no more pending results, returning what we have 44109 1727204260.73747: results queue 
empty 44109 1727204260.73748: checking for any_errors_fatal 44109 1727204260.73757: done checking for any_errors_fatal 44109 1727204260.73758: checking for max_fail_percentage 44109 1727204260.73760: done checking for max_fail_percentage 44109 1727204260.73761: checking to see if all hosts have failed and the running result is not ok 44109 1727204260.73762: done checking to see if all hosts have failed 44109 1727204260.73763: getting the remaining hosts for this loop 44109 1727204260.73764: done getting the remaining hosts for this loop 44109 1727204260.73768: getting the next task for host managed-node1 44109 1727204260.73774: done getting next task for host managed-node1 44109 1727204260.73780: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 44109 1727204260.73782: ^ state is: HOST STATE: block=2, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44109 1727204260.73796: getting variables 44109 1727204260.73798: in VariableManager get_vars() 44109 1727204260.73838: Calling all_inventory to load vars for managed-node1 44109 1727204260.73842: Calling groups_inventory to load vars for managed-node1 44109 1727204260.73844: Calling all_plugins_inventory to load vars for managed-node1 44109 1727204260.73855: Calling all_plugins_play to load vars for managed-node1 44109 1727204260.73858: Calling groups_plugins_inventory to load vars for managed-node1 44109 1727204260.73862: Calling groups_plugins_play to load vars for managed-node1 44109 1727204260.74791: done sending task result for task 028d2410-947f-ed67-a560-000000000096 44109 1727204260.74795: WORKER PROCESS EXITING 44109 1727204260.75561: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44109 1727204260.77125: done with get_vars() 44109 1727204260.77151: done getting variables redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 44109 1727204260.77230: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48 Tuesday 24 September 2024 14:57:40 -0400 (0:00:00.098) 0:00:37.568 ***** 44109 1727204260.77260: entering _queue_task() for managed-node1/yum 44109 1727204260.77799: worker is 1 (out of 1 available) 44109 1727204260.77809: exiting _queue_task() for managed-node1/yum 44109 1727204260.77817: done queuing things up, now 
waiting for results queue to drain 44109 1727204260.77818: waiting for pending results... 44109 1727204260.77914: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 44109 1727204260.78023: in run() - task 028d2410-947f-ed67-a560-000000000097 44109 1727204260.78050: variable 'ansible_search_path' from source: unknown 44109 1727204260.78058: variable 'ansible_search_path' from source: unknown 44109 1727204260.78101: calling self._execute() 44109 1727204260.78436: variable 'ansible_host' from source: host vars for 'managed-node1' 44109 1727204260.78682: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44109 1727204260.78685: variable 'omit' from source: magic vars 44109 1727204260.79103: variable 'ansible_distribution_major_version' from source: facts 44109 1727204260.79217: Evaluated conditional (ansible_distribution_major_version != '6'): True 44109 1727204260.79517: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 44109 1727204260.84792: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 44109 1727204260.84863: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 44109 1727204260.84909: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 44109 1727204260.84947: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 44109 1727204260.85027: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 44109 1727204260.85220: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44109 1727204260.85256: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44109 1727204260.85308: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44109 1727204260.85474: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44109 1727204260.85496: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44109 1727204260.85599: variable 'ansible_distribution_major_version' from source: facts 44109 1727204260.85774: Evaluated conditional (ansible_distribution_major_version | int < 8): False 44109 1727204260.85784: when evaluation is False, skipping this task 44109 1727204260.85791: _execute() done 44109 1727204260.85799: dumping result to json 44109 1727204260.85806: done dumping result, returning 44109 1727204260.85817: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces [028d2410-947f-ed67-a560-000000000097] 44109 1727204260.85826: sending task result for task 028d2410-947f-ed67-a560-000000000097 skipping: [managed-node1] => { "changed": false, "false_condition": "ansible_distribution_major_version | int < 8", "skip_reason": "Conditional result was False" } 44109 1727204260.85981: no more pending results, returning 
what we have 44109 1727204260.85985: results queue empty 44109 1727204260.85986: checking for any_errors_fatal 44109 1727204260.85991: done checking for any_errors_fatal 44109 1727204260.85992: checking for max_fail_percentage 44109 1727204260.85993: done checking for max_fail_percentage 44109 1727204260.85994: checking to see if all hosts have failed and the running result is not ok 44109 1727204260.85995: done checking to see if all hosts have failed 44109 1727204260.85996: getting the remaining hosts for this loop 44109 1727204260.85997: done getting the remaining hosts for this loop 44109 1727204260.86001: getting the next task for host managed-node1 44109 1727204260.86008: done getting next task for host managed-node1 44109 1727204260.86011: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 44109 1727204260.86013: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44109 1727204260.86027: getting variables 44109 1727204260.86028: in VariableManager get_vars() 44109 1727204260.86069: Calling all_inventory to load vars for managed-node1 44109 1727204260.86072: Calling groups_inventory to load vars for managed-node1 44109 1727204260.86075: Calling all_plugins_inventory to load vars for managed-node1 44109 1727204260.86087: Calling all_plugins_play to load vars for managed-node1 44109 1727204260.86090: Calling groups_plugins_inventory to load vars for managed-node1 44109 1727204260.86092: Calling groups_plugins_play to load vars for managed-node1 44109 1727204260.87792: done sending task result for task 028d2410-947f-ed67-a560-000000000097 44109 1727204260.87796: WORKER PROCESS EXITING 44109 1727204260.90693: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44109 1727204260.95199: done with get_vars() 44109 1727204260.95235: done getting variables 44109 1727204260.95610: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60 Tuesday 24 September 2024 14:57:40 -0400 (0:00:00.183) 0:00:37.752 ***** 44109 1727204260.95642: entering _queue_task() for managed-node1/fail 44109 1727204260.96613: worker is 1 (out of 1 available) 44109 1727204260.96623: exiting _queue_task() for managed-node1/fail 44109 1727204260.96633: done queuing things up, now waiting for results queue to drain 44109 1727204260.96634: waiting for pending results... 
44109 1727204260.96746: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 44109 1727204260.97069: in run() - task 028d2410-947f-ed67-a560-000000000098 44109 1727204260.97097: variable 'ansible_search_path' from source: unknown 44109 1727204260.97161: variable 'ansible_search_path' from source: unknown 44109 1727204260.97210: calling self._execute() 44109 1727204260.97318: variable 'ansible_host' from source: host vars for 'managed-node1' 44109 1727204260.97330: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44109 1727204260.97346: variable 'omit' from source: magic vars 44109 1727204260.97870: variable 'ansible_distribution_major_version' from source: facts 44109 1727204260.97890: Evaluated conditional (ansible_distribution_major_version != '6'): True 44109 1727204260.98019: variable '__network_wireless_connections_defined' from source: role '' defaults 44109 1727204260.98224: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 44109 1727204261.00422: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 44109 1727204261.00552: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 44109 1727204261.00556: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 44109 1727204261.00585: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 44109 1727204261.00617: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 44109 1727204261.00707: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 
(found_in_cache=True, class_only=False) 44109 1727204261.00758: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44109 1727204261.00795: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44109 1727204261.00841: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44109 1727204261.00865: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44109 1727204261.00982: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44109 1727204261.00985: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44109 1727204261.00988: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44109 1727204261.01018: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44109 1727204261.01037: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44109 1727204261.01082: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44109 1727204261.01116: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44109 1727204261.01145: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44109 1727204261.01188: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44109 1727204261.01212: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44109 1727204261.01392: variable 'network_connections' from source: play vars 44109 1727204261.01409: variable 'profile' from source: play vars 44109 1727204261.01580: variable 'profile' from source: play vars 44109 1727204261.01584: variable 'interface' from source: set_fact 44109 1727204261.01586: variable 'interface' from source: set_fact 44109 1727204261.01635: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 44109 1727204261.01802: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 44109 1727204261.01850: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 44109 1727204261.01885: Loading 
TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 44109 1727204261.01925: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 44109 1727204261.01971: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 44109 1727204261.02001: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 44109 1727204261.02034: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 44109 1727204261.02066: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 44109 1727204261.02117: variable '__network_team_connections_defined' from source: role '' defaults 44109 1727204261.02360: variable 'network_connections' from source: play vars 44109 1727204261.02371: variable 'profile' from source: play vars 44109 1727204261.02439: variable 'profile' from source: play vars 44109 1727204261.02448: variable 'interface' from source: set_fact 44109 1727204261.02519: variable 'interface' from source: set_fact 44109 1727204261.02550: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 44109 1727204261.02558: when evaluation is False, skipping this task 44109 1727204261.02571: _execute() done 44109 1727204261.02682: dumping result to json 44109 1727204261.02685: done dumping result, returning 44109 1727204261.02688: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Ask user's 
consent to restart NetworkManager due to wireless or team interfaces [028d2410-947f-ed67-a560-000000000098] 44109 1727204261.02697: sending task result for task 028d2410-947f-ed67-a560-000000000098 44109 1727204261.02769: done sending task result for task 028d2410-947f-ed67-a560-000000000098 44109 1727204261.02772: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 44109 1727204261.02832: no more pending results, returning what we have 44109 1727204261.02836: results queue empty 44109 1727204261.02837: checking for any_errors_fatal 44109 1727204261.02845: done checking for any_errors_fatal 44109 1727204261.02846: checking for max_fail_percentage 44109 1727204261.02848: done checking for max_fail_percentage 44109 1727204261.02849: checking to see if all hosts have failed and the running result is not ok 44109 1727204261.02849: done checking to see if all hosts have failed 44109 1727204261.02850: getting the remaining hosts for this loop 44109 1727204261.02851: done getting the remaining hosts for this loop 44109 1727204261.02854: getting the next task for host managed-node1 44109 1727204261.02860: done getting next task for host managed-node1 44109 1727204261.02864: ^ task is: TASK: fedora.linux_system_roles.network : Install packages 44109 1727204261.02865: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44109 1727204261.02880: getting variables 44109 1727204261.02881: in VariableManager get_vars() 44109 1727204261.02917: Calling all_inventory to load vars for managed-node1 44109 1727204261.02919: Calling groups_inventory to load vars for managed-node1 44109 1727204261.02921: Calling all_plugins_inventory to load vars for managed-node1 44109 1727204261.02931: Calling all_plugins_play to load vars for managed-node1 44109 1727204261.02933: Calling groups_plugins_inventory to load vars for managed-node1 44109 1727204261.02936: Calling groups_plugins_play to load vars for managed-node1 44109 1727204261.05749: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44109 1727204261.09314: done with get_vars() 44109 1727204261.09339: done getting variables 44109 1727204261.09403: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install packages] ******************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73 Tuesday 24 September 2024 14:57:41 -0400 (0:00:00.137) 0:00:37.890 ***** 44109 1727204261.09436: entering _queue_task() for managed-node1/package 44109 1727204261.10195: worker is 1 (out of 1 available) 44109 1727204261.10207: exiting _queue_task() for managed-node1/package 44109 1727204261.10218: done queuing things up, now waiting for results queue to drain 44109 1727204261.10219: waiting for pending results... 
44109 1727204261.11035: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Install packages 44109 1727204261.11040: in run() - task 028d2410-947f-ed67-a560-000000000099 44109 1727204261.11043: variable 'ansible_search_path' from source: unknown 44109 1727204261.11046: variable 'ansible_search_path' from source: unknown 44109 1727204261.11251: calling self._execute() 44109 1727204261.11469: variable 'ansible_host' from source: host vars for 'managed-node1' 44109 1727204261.11482: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44109 1727204261.11494: variable 'omit' from source: magic vars 44109 1727204261.12180: variable 'ansible_distribution_major_version' from source: facts 44109 1727204261.12380: Evaluated conditional (ansible_distribution_major_version != '6'): True 44109 1727204261.12590: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 44109 1727204261.13151: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 44109 1727204261.13319: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 44109 1727204261.13357: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 44109 1727204261.13556: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 44109 1727204261.13757: variable 'network_packages' from source: role '' defaults 44109 1727204261.14140: variable '__network_provider_setup' from source: role '' defaults 44109 1727204261.14144: variable '__network_service_name_default_nm' from source: role '' defaults 44109 1727204261.14146: variable '__network_service_name_default_nm' from source: role '' defaults 44109 1727204261.14179: variable '__network_packages_default_nm' from source: role '' defaults 44109 1727204261.14477: variable 
'__network_packages_default_nm' from source: role '' defaults 44109 1727204261.14773: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 44109 1727204261.16994: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 44109 1727204261.17059: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 44109 1727204261.17107: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 44109 1727204261.17177: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 44109 1727204261.17180: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 44109 1727204261.17257: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44109 1727204261.17295: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44109 1727204261.17327: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44109 1727204261.17390: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44109 1727204261.17393: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44109 
1727204261.17441: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
44109 1727204261.17499: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
44109 1727204261.17502: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
44109 1727204261.17544: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
44109 1727204261.17564: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
44109 1727204261.17800: variable '__network_packages_default_gobject_packages' from source: role '' defaults
44109 1727204261.17918: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
44109 1727204261.18041: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
44109 1727204261.18045: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
44109 1727204261.18047: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
44109 1727204261.18050: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
44109 1727204261.18129: variable 'ansible_python' from source: facts
44109 1727204261.18165: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults
44109 1727204261.18250: variable '__network_wpa_supplicant_required' from source: role '' defaults
44109 1727204261.18337: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults
44109 1727204261.18487: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
44109 1727204261.18516: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
44109 1727204261.18546: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
44109 1727204261.18594: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
44109 1727204261.18612: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
44109 1727204261.18658: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
44109 1727204261.18702: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
44109 1727204261.18732: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
44109 1727204261.18774: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
44109 1727204261.18803: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
44109 1727204261.18947: variable 'network_connections' from source: play vars
44109 1727204261.18958: variable 'profile' from source: play vars
44109 1727204261.19127: variable 'profile' from source: play vars
44109 1727204261.19131: variable 'interface' from source: set_fact
44109 1727204261.19150: variable 'interface' from source: set_fact
44109 1727204261.19224: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False)
44109 1727204261.19260: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False)
44109 1727204261.19298: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False)
44109 1727204261.19334: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False)
44109 1727204261.19393: variable '__network_wireless_connections_defined' from source: role '' defaults
44109 1727204261.19981: variable 'network_connections' from source: play vars
44109 1727204261.19985: variable 'profile' from source: play vars
44109 1727204261.20108: variable 'profile' from source: play vars
44109 1727204261.20121: variable 'interface' from source: set_fact
44109 1727204261.20324: variable 'interface' from source: set_fact
44109 1727204261.20336: variable '__network_packages_default_wireless' from source: role '' defaults
44109 1727204261.20468: variable '__network_wireless_connections_defined' from source: role '' defaults
44109 1727204261.20945: variable 'network_connections' from source: play vars
44109 1727204261.20954: variable 'profile' from source: play vars
44109 1727204261.21032: variable 'profile' from source: play vars
44109 1727204261.21040: variable 'interface' from source: set_fact
44109 1727204261.21146: variable 'interface' from source: set_fact
44109 1727204261.21184: variable '__network_packages_default_team' from source: role '' defaults
44109 1727204261.21271: variable '__network_team_connections_defined' from source: role '' defaults
44109 1727204261.21596: variable 'network_connections' from source: play vars
44109 1727204261.21607: variable 'profile' from source: play vars
44109 1727204261.21674: variable 'profile' from source: play vars
44109 1727204261.21772: variable 'interface' from source: set_fact
44109 1727204261.21795: variable 'interface' from source: set_fact
44109 1727204261.21881: variable '__network_service_name_default_initscripts' from source: role '' defaults
44109 1727204261.21924: variable '__network_service_name_default_initscripts' from source: role '' defaults
44109 1727204261.21937: variable '__network_packages_default_initscripts' from source: role '' defaults
44109 1727204261.22006: variable '__network_packages_default_initscripts' from source: role '' defaults
44109 1727204261.22230: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults
44109 1727204261.22764: variable 'network_connections' from source: play vars
44109 1727204261.22768: variable 'profile' from source: play vars
44109 1727204261.22804: variable 'profile' from source: play vars
44109 1727204261.22818: variable 'interface' from source: set_fact
44109 1727204261.22936: variable 'interface' from source: set_fact
44109 1727204261.22950: variable 'ansible_distribution' from source: facts
44109 1727204261.22982: variable '__network_rh_distros' from source: role '' defaults
44109 1727204261.22985: variable 'ansible_distribution_major_version' from source: facts
44109 1727204261.22994: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults
44109 1727204261.23199: variable 'ansible_distribution' from source: facts
44109 1727204261.23482: variable '__network_rh_distros' from source: role '' defaults
44109 1727204261.23485: variable 'ansible_distribution_major_version' from source: facts
44109 1727204261.23487: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults
44109 1727204261.23608: variable 'ansible_distribution' from source: facts
44109 1727204261.23618: variable '__network_rh_distros' from source: role '' defaults
44109 1727204261.23630: variable 'ansible_distribution_major_version' from source: facts
44109 1727204261.23718: variable 'network_provider' from source: set_fact
44109 1727204261.23741: variable 'ansible_facts' from source: unknown
44109 1727204261.24867: Evaluated conditional (not network_packages is subset(ansible_facts.packages.keys())): False
44109 1727204261.24877: when evaluation is False, skipping this task
44109 1727204261.24886: _execute() done
44109 1727204261.24894: dumping result to json
44109 1727204261.24972: done dumping result, returning
44109 1727204261.24977: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Install packages [028d2410-947f-ed67-a560-000000000099]
44109 1727204261.24980: sending task result for task 028d2410-947f-ed67-a560-000000000099
44109 1727204261.25050: done sending task result for task 028d2410-947f-ed67-a560-000000000099
44109 1727204261.25053: WORKER PROCESS EXITING
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "not network_packages is subset(ansible_facts.packages.keys())",
    "skip_reason": "Conditional result was False"
}
44109 1727204261.25125: no more pending results, returning what we have
44109 1727204261.25129: results queue empty
44109 1727204261.25130: checking for any_errors_fatal
44109 1727204261.25138: done checking for any_errors_fatal
44109 1727204261.25139: checking for max_fail_percentage
44109 1727204261.25141: done checking for max_fail_percentage
44109 1727204261.25142: checking to see if all hosts have failed and the running result is not ok
44109 1727204261.25142: done checking to see if all hosts have failed
44109 1727204261.25143: getting the remaining hosts for this loop
44109 1727204261.25144: done getting the remaining hosts for this loop
44109 1727204261.25148: getting the next task for host managed-node1
44109 1727204261.25154: done getting next task for host managed-node1
44109 1727204261.25157: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable
44109 1727204261.25159: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
44109 1727204261.25172: getting variables
44109 1727204261.25173: in VariableManager get_vars()
44109 1727204261.25212: Calling all_inventory to load vars for managed-node1
44109 1727204261.25215: Calling groups_inventory to load vars for managed-node1
44109 1727204261.25217: Calling all_plugins_inventory to load vars for managed-node1
44109 1727204261.25231: Calling all_plugins_play to load vars for managed-node1
44109 1727204261.25234: Calling groups_plugins_inventory to load vars for managed-node1
44109 1727204261.25237: Calling groups_plugins_play to load vars for managed-node1
44109 1727204261.26964: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
44109 1727204261.29655: done with get_vars()
44109 1727204261.29704: done getting variables
44109 1727204261.29765: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)
TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] ***
task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85
Tuesday 24 September 2024 14:57:41 -0400 (0:00:00.203) 0:00:38.094 *****
44109 1727204261.29801: entering _queue_task() for managed-node1/package
44109 1727204261.30157: worker is 1 (out of 1 available)
44109 1727204261.30171: exiting _queue_task() for managed-node1/package
44109 1727204261.30187: done queuing things up, now waiting for results queue to drain
44109 1727204261.30189: waiting for pending results...
44109 1727204261.30583: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable
44109 1727204261.30656: in run() - task 028d2410-947f-ed67-a560-00000000009a
44109 1727204261.30659: variable 'ansible_search_path' from source: unknown
44109 1727204261.30662: variable 'ansible_search_path' from source: unknown
44109 1727204261.30684: calling self._execute()
44109 1727204261.30853: variable 'ansible_host' from source: host vars for 'managed-node1'
44109 1727204261.30857: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1'
44109 1727204261.30859: variable 'omit' from source: magic vars
44109 1727204261.31274: variable 'ansible_distribution_major_version' from source: facts
44109 1727204261.31295: Evaluated conditional (ansible_distribution_major_version != '6'): True
44109 1727204261.31574: variable 'network_state' from source: role '' defaults
44109 1727204261.31584: Evaluated conditional (network_state != {}): False
44109 1727204261.31606: when evaluation is False, skipping this task
44109 1727204261.31609: _execute() done
44109 1727204261.31660: dumping result to json
44109 1727204261.31663: done dumping result, returning
44109 1727204261.31666: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [028d2410-947f-ed67-a560-00000000009a]
44109 1727204261.31669: sending task result for task 028d2410-947f-ed67-a560-00000000009a
44109 1727204261.32061: done sending task result for task 028d2410-947f-ed67-a560-00000000009a
44109 1727204261.32064: WORKER PROCESS EXITING
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "network_state != {}",
    "skip_reason": "Conditional result was False"
}
44109 1727204261.32119: no more pending results, returning what we have
44109 1727204261.32124: results queue empty
44109 1727204261.32126: checking for any_errors_fatal
44109 1727204261.32133: done checking for any_errors_fatal
44109 1727204261.32134: checking for max_fail_percentage
44109 1727204261.32136: done checking for max_fail_percentage
44109 1727204261.32137: checking to see if all hosts have failed and the running result is not ok
44109 1727204261.32138: done checking to see if all hosts have failed
44109 1727204261.32138: getting the remaining hosts for this loop
44109 1727204261.32140: done getting the remaining hosts for this loop
44109 1727204261.32143: getting the next task for host managed-node1
44109 1727204261.32149: done getting next task for host managed-node1
44109 1727204261.32153: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable
44109 1727204261.32156: ^ state is: HOST STATE: block=2, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
44109 1727204261.32171: getting variables
44109 1727204261.32176: in VariableManager get_vars()
44109 1727204261.32216: Calling all_inventory to load vars for managed-node1
44109 1727204261.32219: Calling groups_inventory to load vars for managed-node1
44109 1727204261.32222: Calling all_plugins_inventory to load vars for managed-node1
44109 1727204261.32233: Calling all_plugins_play to load vars for managed-node1
44109 1727204261.32236: Calling groups_plugins_inventory to load vars for managed-node1
44109 1727204261.32240: Calling groups_plugins_play to load vars for managed-node1
44109 1727204261.34921: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
44109 1727204261.36577: done with get_vars()
44109 1727204261.36607: done getting variables
44109 1727204261.36667: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)
TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] ***
task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96
Tuesday 24 September 2024 14:57:41 -0400 (0:00:00.068) 0:00:38.163 *****
44109 1727204261.36698: entering _queue_task() for managed-node1/package
44109 1727204261.37052: worker is 1 (out of 1 available)
44109 1727204261.37065: exiting _queue_task() for managed-node1/package
44109 1727204261.37279: done queuing things up, now waiting for results queue to drain
44109 1727204261.37281: waiting for pending results...
44109 1727204261.37494: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable
44109 1727204261.37498: in run() - task 028d2410-947f-ed67-a560-00000000009b
44109 1727204261.37523: variable 'ansible_search_path' from source: unknown
44109 1727204261.37532: variable 'ansible_search_path' from source: unknown
44109 1727204261.37571: calling self._execute()
44109 1727204261.37679: variable 'ansible_host' from source: host vars for 'managed-node1'
44109 1727204261.37693: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1'
44109 1727204261.37708: variable 'omit' from source: magic vars
44109 1727204261.38589: variable 'ansible_distribution_major_version' from source: facts
44109 1727204261.38596: Evaluated conditional (ansible_distribution_major_version != '6'): True
44109 1727204261.38763: variable 'network_state' from source: role '' defaults
44109 1727204261.38781: Evaluated conditional (network_state != {}): False
44109 1727204261.38819: when evaluation is False, skipping this task
44109 1727204261.38828: _execute() done
44109 1727204261.38836: dumping result to json
44109 1727204261.38843: done dumping result, returning
44109 1727204261.38923: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [028d2410-947f-ed67-a560-00000000009b]
44109 1727204261.38927: sending task result for task 028d2410-947f-ed67-a560-00000000009b
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "network_state != {}",
    "skip_reason": "Conditional result was False"
}
44109 1727204261.39274: no more pending results, returning what we have
44109 1727204261.39281: results queue empty
44109 1727204261.39282: checking for any_errors_fatal
44109 1727204261.39290: done checking for any_errors_fatal
44109 1727204261.39291: checking for max_fail_percentage
44109 1727204261.39293: done checking for max_fail_percentage
44109 1727204261.39294: checking to see if all hosts have failed and the running result is not ok
44109 1727204261.39295: done checking to see if all hosts have failed
44109 1727204261.39296: getting the remaining hosts for this loop
44109 1727204261.39298: done getting the remaining hosts for this loop
44109 1727204261.39302: getting the next task for host managed-node1
44109 1727204261.39310: done getting next task for host managed-node1
44109 1727204261.39314: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces
44109 1727204261.39316: ^ state is: HOST STATE: block=2, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
44109 1727204261.39333: getting variables
44109 1727204261.39336: in VariableManager get_vars()
44109 1727204261.39388: Calling all_inventory to load vars for managed-node1
44109 1727204261.39391: Calling groups_inventory to load vars for managed-node1
44109 1727204261.39395: Calling all_plugins_inventory to load vars for managed-node1
44109 1727204261.39409: Calling all_plugins_play to load vars for managed-node1
44109 1727204261.39413: Calling groups_plugins_inventory to load vars for managed-node1
44109 1727204261.39416: Calling groups_plugins_play to load vars for managed-node1
44109 1727204261.40088: done sending task result for task 028d2410-947f-ed67-a560-00000000009b
44109 1727204261.40091: WORKER PROCESS EXITING
44109 1727204261.40988: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
44109 1727204261.42576: done with get_vars()
44109 1727204261.42601: done getting variables
44109 1727204261.42658: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)
TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] ***
task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109
Tuesday 24 September 2024 14:57:41 -0400 (0:00:00.059) 0:00:38.223 *****
44109 1727204261.42691: entering _queue_task() for managed-node1/service
44109 1727204261.43023: worker is 1 (out of 1 available)
44109 1727204261.43035: exiting _queue_task() for managed-node1/service
44109 1727204261.43047: done queuing things up, now waiting for results queue to drain
44109 1727204261.43048: waiting for pending results...
44109 1727204261.43331: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces
44109 1727204261.43452: in run() - task 028d2410-947f-ed67-a560-00000000009c
44109 1727204261.43473: variable 'ansible_search_path' from source: unknown
44109 1727204261.43483: variable 'ansible_search_path' from source: unknown
44109 1727204261.43528: calling self._execute()
44109 1727204261.43634: variable 'ansible_host' from source: host vars for 'managed-node1'
44109 1727204261.43645: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1'
44109 1727204261.43661: variable 'omit' from source: magic vars
44109 1727204261.44033: variable 'ansible_distribution_major_version' from source: facts
44109 1727204261.44053: Evaluated conditional (ansible_distribution_major_version != '6'): True
44109 1727204261.44183: variable '__network_wireless_connections_defined' from source: role '' defaults
44109 1727204261.44393: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name
44109 1727204261.46984: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py
44109 1727204261.47005: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py
44109 1727204261.47048: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py
44109 1727204261.47092: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py
44109 1727204261.47124: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py
44109 1727204261.47209: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
44109 1727204261.47246: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
44109 1727204261.47277: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
44109 1727204261.47325: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
44109 1727204261.47413: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
44109 1727204261.47417: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
44109 1727204261.47425: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
44109 1727204261.47454: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
44109 1727204261.47501: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
44109 1727204261.47527: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
44109 1727204261.47570: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
44109 1727204261.47600: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
44109 1727204261.47632: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
44109 1727204261.47675: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
44109 1727204261.47696: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
44109 1727204261.47872: variable 'network_connections' from source: play vars
44109 1727204261.47890: variable 'profile' from source: play vars
44109 1727204261.48062: variable 'profile' from source: play vars
44109 1727204261.48065: variable 'interface' from source: set_fact
44109 1727204261.48068: variable 'interface' from source: set_fact
44109 1727204261.48118: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name
44109 1727204261.48306: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py
44109 1727204261.48349: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py
44109 1727204261.48386: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py
44109 1727204261.48418: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py
44109 1727204261.48463: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False)
44109 1727204261.48495: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False)
44109 1727204261.48524: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False)
44109 1727204261.48553: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False)
44109 1727204261.48609: variable '__network_team_connections_defined' from source: role '' defaults
44109 1727204261.48850: variable 'network_connections' from source: play vars
44109 1727204261.48860: variable 'profile' from source: play vars
44109 1727204261.48928: variable 'profile' from source: play vars
44109 1727204261.48937: variable 'interface' from source: set_fact
44109 1727204261.49000: variable 'interface' from source: set_fact
44109 1727204261.49033: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False
44109 1727204261.49142: when evaluation is False, skipping this task
44109 1727204261.49145: _execute() done
44109 1727204261.49147: dumping result to json
44109 1727204261.49149: done dumping result, returning
44109 1727204261.49152: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces [028d2410-947f-ed67-a560-00000000009c]
44109 1727204261.49162: sending task result for task 028d2410-947f-ed67-a560-00000000009c
44109 1727204261.49234: done sending task result for task 028d2410-947f-ed67-a560-00000000009c
44109 1727204261.49237: WORKER PROCESS EXITING
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined",
    "skip_reason": "Conditional result was False"
}
44109 1727204261.49295: no more pending results, returning what we have
44109 1727204261.49299: results queue empty
44109 1727204261.49300: checking for any_errors_fatal
44109 1727204261.49310: done checking for any_errors_fatal
44109 1727204261.49310: checking for max_fail_percentage
44109 1727204261.49312: done checking for max_fail_percentage
44109 1727204261.49313: checking to see if all hosts have failed and the running result is not ok
44109 1727204261.49314: done checking to see if all hosts have failed
44109 1727204261.49315: getting the remaining hosts for this loop
44109 1727204261.49316: done getting the remaining hosts for this loop
44109 1727204261.49320: getting the next task for host managed-node1
44109 1727204261.49327: done getting next task for host managed-node1
44109 1727204261.49331: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager
44109 1727204261.49333: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
44109 1727204261.49347: getting variables
44109 1727204261.49349: in VariableManager get_vars()
44109 1727204261.49389: Calling all_inventory to load vars for managed-node1
44109 1727204261.49392: Calling groups_inventory to load vars for managed-node1
44109 1727204261.49394: Calling all_plugins_inventory to load vars for managed-node1
44109 1727204261.49404: Calling all_plugins_play to load vars for managed-node1
44109 1727204261.49407: Calling groups_plugins_inventory to load vars for managed-node1
44109 1727204261.49410: Calling groups_plugins_play to load vars for managed-node1
44109 1727204261.51230: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
44109 1727204261.52743: done with get_vars()
44109 1727204261.52766: done getting variables
44109 1727204261.52825: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)
TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] *****
task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122
Tuesday 24 September 2024 14:57:41 -0400 (0:00:00.101) 0:00:38.324 *****
44109 1727204261.52853: entering _queue_task() for managed-node1/service
44109 1727204261.53387: worker is 1 (out of 1 available)
44109 1727204261.53397: exiting _queue_task() for managed-node1/service
44109 1727204261.53407: done queuing things up, now waiting for results queue to drain
44109 1727204261.53408: waiting for pending results...
44109 1727204261.53537: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager
44109 1727204261.53623: in run() - task 028d2410-947f-ed67-a560-00000000009d
44109 1727204261.53742: variable 'ansible_search_path' from source: unknown
44109 1727204261.53745: variable 'ansible_search_path' from source: unknown
44109 1727204261.53748: calling self._execute()
44109 1727204261.53804: variable 'ansible_host' from source: host vars for 'managed-node1'
44109 1727204261.53815: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1'
44109 1727204261.53830: variable 'omit' from source: magic vars
44109 1727204261.54220: variable 'ansible_distribution_major_version' from source: facts
44109 1727204261.54238: Evaluated conditional (ansible_distribution_major_version != '6'): True
44109 1727204261.54407: variable 'network_provider' from source: set_fact
44109 1727204261.54418: variable 'network_state' from source: role '' defaults
44109 1727204261.54433: Evaluated conditional (network_provider == "nm" or network_state != {}): True
44109 1727204261.54443: variable 'omit' from source: magic vars
44109 1727204261.54490: variable 'omit' from source: magic vars
44109 1727204261.54528: variable 'network_service_name' from source: role '' defaults
44109 1727204261.54605: variable 'network_service_name' from source: role '' defaults
44109 1727204261.54712: variable '__network_provider_setup' from source: role '' defaults
44109 1727204261.54728: variable '__network_service_name_default_nm' from source: role '' defaults
44109 1727204261.54829: variable '__network_service_name_default_nm' from source: role '' defaults
44109 1727204261.54832: variable '__network_packages_default_nm' from source: role '' defaults
44109 1727204261.54872: variable '__network_packages_default_nm' from source: role '' defaults
44109 1727204261.55109: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name
44109 1727204261.58215: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py
44109 1727204261.58296: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py
44109 1727204261.58343: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py
44109 1727204261.58482: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py
44109 1727204261.58485: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py
44109 1727204261.58514: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
44109 1727204261.58550: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
44109 1727204261.58582: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
44109 1727204261.58632: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
44109 1727204261.58649: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
44109 1727204261.58702: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
44109 1727204261.58732: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
44109 1727204261.58759: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
44109 1727204261.58805: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
44109 1727204261.58827: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
44109 1727204261.59060: variable '__network_packages_default_gobject_packages' from source: role '' defaults
44109 1727204261.59186: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
44109 1727204261.59215: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
44109 1727204261.59355: Loading FilterModule
'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44109 1727204261.59359: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44109 1727204261.59361: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44109 1727204261.59401: variable 'ansible_python' from source: facts 44109 1727204261.59429: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 44109 1727204261.59519: variable '__network_wpa_supplicant_required' from source: role '' defaults 44109 1727204261.59604: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 44109 1727204261.59737: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44109 1727204261.59767: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44109 1727204261.59801: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44109 1727204261.59843: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44109 1727204261.59862: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44109 1727204261.59922: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44109 1727204261.59961: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44109 1727204261.59992: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44109 1727204261.60049: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44109 1727204261.60069: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44109 1727204261.60211: variable 'network_connections' from source: play vars 44109 1727204261.60282: variable 'profile' from source: play vars 44109 1727204261.60308: variable 'profile' from source: play vars 44109 1727204261.60324: variable 'interface' from source: set_fact 44109 1727204261.60384: variable 'interface' from source: set_fact 44109 1727204261.60492: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 44109 1727204261.60687: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 44109 1727204261.60745: Loading TestModule 'files' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 44109 1727204261.60797: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 44109 1727204261.60849: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 44109 1727204261.60989: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 44109 1727204261.60993: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 44109 1727204261.60995: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 44109 1727204261.61022: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 44109 1727204261.61072: variable '__network_wireless_connections_defined' from source: role '' defaults 44109 1727204261.61344: variable 'network_connections' from source: play vars 44109 1727204261.61531: variable 'profile' from source: play vars 44109 1727204261.61534: variable 'profile' from source: play vars 44109 1727204261.61536: variable 'interface' from source: set_fact 44109 1727204261.61686: variable 'interface' from source: set_fact 44109 1727204261.61726: variable '__network_packages_default_wireless' from source: role '' defaults 44109 1727204261.61940: variable '__network_wireless_connections_defined' from source: role '' defaults 44109 1727204261.62552: variable 'network_connections' from source: play vars 44109 1727204261.62561: variable 'profile' from source: play vars 44109 
1727204261.62686: variable 'profile' from source: play vars 44109 1727204261.62734: variable 'interface' from source: set_fact 44109 1727204261.62943: variable 'interface' from source: set_fact 44109 1727204261.62954: variable '__network_packages_default_team' from source: role '' defaults 44109 1727204261.63036: variable '__network_team_connections_defined' from source: role '' defaults 44109 1727204261.63674: variable 'network_connections' from source: play vars 44109 1727204261.63806: variable 'profile' from source: play vars 44109 1727204261.63864: variable 'profile' from source: play vars 44109 1727204261.63924: variable 'interface' from source: set_fact 44109 1727204261.64023: variable 'interface' from source: set_fact 44109 1727204261.64190: variable '__network_service_name_default_initscripts' from source: role '' defaults 44109 1727204261.64458: variable '__network_service_name_default_initscripts' from source: role '' defaults 44109 1727204261.64461: variable '__network_packages_default_initscripts' from source: role '' defaults 44109 1727204261.64463: variable '__network_packages_default_initscripts' from source: role '' defaults 44109 1727204261.64872: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 44109 1727204261.65991: variable 'network_connections' from source: play vars 44109 1727204261.66001: variable 'profile' from source: play vars 44109 1727204261.66067: variable 'profile' from source: play vars 44109 1727204261.66189: variable 'interface' from source: set_fact 44109 1727204261.66265: variable 'interface' from source: set_fact 44109 1727204261.66318: variable 'ansible_distribution' from source: facts 44109 1727204261.66480: variable '__network_rh_distros' from source: role '' defaults 44109 1727204261.66484: variable 'ansible_distribution_major_version' from source: facts 44109 1727204261.66486: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 44109 
1727204261.66882: variable 'ansible_distribution' from source: facts 44109 1727204261.66885: variable '__network_rh_distros' from source: role '' defaults 44109 1727204261.66887: variable 'ansible_distribution_major_version' from source: facts 44109 1727204261.66889: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 44109 1727204261.67158: variable 'ansible_distribution' from source: facts 44109 1727204261.67216: variable '__network_rh_distros' from source: role '' defaults 44109 1727204261.67226: variable 'ansible_distribution_major_version' from source: facts 44109 1727204261.67355: variable 'network_provider' from source: set_fact 44109 1727204261.67385: variable 'omit' from source: magic vars 44109 1727204261.67418: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 44109 1727204261.67511: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 44109 1727204261.67752: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 44109 1727204261.67755: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44109 1727204261.67758: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44109 1727204261.67760: variable 'inventory_hostname' from source: host vars for 'managed-node1' 44109 1727204261.67762: variable 'ansible_host' from source: host vars for 'managed-node1' 44109 1727204261.67764: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44109 1727204261.67931: Set connection var ansible_connection to ssh 44109 1727204261.67943: Set connection var ansible_timeout to 10 44109 1727204261.67952: Set connection var ansible_module_compression to ZIP_DEFLATED 44109 1727204261.67976: Set connection var 
ansible_pipelining to False 44109 1727204261.68182: Set connection var ansible_shell_executable to /bin/sh 44109 1727204261.68185: Set connection var ansible_shell_type to sh 44109 1727204261.68187: variable 'ansible_shell_executable' from source: unknown 44109 1727204261.68189: variable 'ansible_connection' from source: unknown 44109 1727204261.68191: variable 'ansible_module_compression' from source: unknown 44109 1727204261.68193: variable 'ansible_shell_type' from source: unknown 44109 1727204261.68195: variable 'ansible_shell_executable' from source: unknown 44109 1727204261.68196: variable 'ansible_host' from source: host vars for 'managed-node1' 44109 1727204261.68203: variable 'ansible_pipelining' from source: unknown 44109 1727204261.68205: variable 'ansible_timeout' from source: unknown 44109 1727204261.68207: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44109 1727204261.68345: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 44109 1727204261.68616: variable 'omit' from source: magic vars 44109 1727204261.68619: starting attempt loop 44109 1727204261.68622: running the handler 44109 1727204261.68624: variable 'ansible_facts' from source: unknown 44109 1727204261.70621: _low_level_execute_command(): starting 44109 1727204261.70633: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 44109 1727204261.72016: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44109 1727204261.72095: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44109 1727204261.72247: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 44109 1727204261.72250: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44109 1727204261.72252: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44109 1727204261.72416: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44109 1727204261.74293: stdout chunk (state=3): >>>/root <<< 44109 1727204261.74534: stdout chunk (state=3): >>><<< 44109 1727204261.74538: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44109 1727204261.74541: stderr chunk (state=3): >>><<< 44109 1727204261.74544: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: 
Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44109 1727204261.74546: _low_level_execute_command(): starting 44109 1727204261.74549: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204261.744343-46752-249874026570313 `" && echo ansible-tmp-1727204261.744343-46752-249874026570313="` echo /root/.ansible/tmp/ansible-tmp-1727204261.744343-46752-249874026570313 `" ) && sleep 0' 44109 1727204261.75654: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 44109 1727204261.75668: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44109 1727204261.75680: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 
originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44109 1727204261.75863: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 44109 1727204261.75958: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 44109 1727204261.76089: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44109 1727204261.78188: stdout chunk (state=3): >>>ansible-tmp-1727204261.744343-46752-249874026570313=/root/.ansible/tmp/ansible-tmp-1727204261.744343-46752-249874026570313 <<< 44109 1727204261.78326: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44109 1727204261.78330: stdout chunk (state=3): >>><<< 44109 1727204261.78336: stderr chunk (state=3): >>><<< 44109 1727204261.78358: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204261.744343-46752-249874026570313=/root/.ansible/tmp/ansible-tmp-1727204261.744343-46752-249874026570313 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44109 1727204261.78392: variable 'ansible_module_compression' from source: unknown 44109 1727204261.78444: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-44109pzfqangk/ansiballz_cache/ansible.modules.systemd-ZIP_DEFLATED 44109 1727204261.78683: variable 'ansible_facts' from source: unknown 44109 1727204261.78937: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204261.744343-46752-249874026570313/AnsiballZ_systemd.py 44109 1727204261.79136: Sending initial data 44109 1727204261.79144: Sent initial data (155 bytes) 44109 1727204261.79748: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44109 1727204261.79771: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44109 1727204261.79878: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at 
'/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK <<< 44109 1727204261.79903: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44109 1727204261.80016: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44109 1727204261.81802: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 44109 1727204261.81867: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 44109 1727204261.81949: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-44109pzfqangk/tmpm0o_go99 /root/.ansible/tmp/ansible-tmp-1727204261.744343-46752-249874026570313/AnsiballZ_systemd.py <<< 44109 1727204261.81952: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204261.744343-46752-249874026570313/AnsiballZ_systemd.py" <<< 44109 1727204261.82027: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-44109pzfqangk/tmpm0o_go99" to remote "/root/.ansible/tmp/ansible-tmp-1727204261.744343-46752-249874026570313/AnsiballZ_systemd.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204261.744343-46752-249874026570313/AnsiballZ_systemd.py" <<< 44109 1727204261.84393: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44109 1727204261.84397: stdout chunk (state=3): >>><<< 44109 1727204261.84400: stderr chunk (state=3): >>><<< 44109 1727204261.84402: done transferring module to remote 44109 1727204261.84404: _low_level_execute_command(): starting 44109 1727204261.84406: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204261.744343-46752-249874026570313/ /root/.ansible/tmp/ansible-tmp-1727204261.744343-46752-249874026570313/AnsiballZ_systemd.py && sleep 0' 44109 1727204261.85021: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44109 1727204261.85039: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44109 1727204261.85054: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44109 1727204261.85136: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44109 1727204261.85180: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 44109 1727204261.85202: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44109 1727204261.85243: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44109 1727204261.85493: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44109 1727204261.87767: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44109 1727204261.87770: stdout chunk (state=3): >>><<< 44109 1727204261.87773: stderr chunk (state=3): >>><<< 44109 1727204261.87776: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44109 1727204261.87779: _low_level_execute_command(): starting 44109 1727204261.87782: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204261.744343-46752-249874026570313/AnsiballZ_systemd.py && sleep 0' 44109 1727204261.88891: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found <<< 44109 1727204261.88978: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 44109 1727204261.89020: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44109 1727204261.89057: stderr chunk (state=3): >>>debug2: 
mux_client_hello_exchange: master version 4 <<< 44109 1727204261.89199: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44109 1727204262.20036: stdout chunk (state=3): >>> {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "705", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Tue 2024-09-24 14:44:02 EDT", "ExecMainStartTimestampMonotonic": "33322039", "ExecMainExitTimestampMonotonic": "0", "ExecMainHandoffTimestamp": "Tue 2024-09-24 14:44:02 EDT", "ExecMainHandoffTimestampMonotonic": "33336258", "ExecMainPID": "705", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call 
org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "2938", "MemoryCurrent": "10747904", "MemoryPeak": "13869056", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3279679488", "EffectiveMemoryMax": "3702874112", "EffectiveMemoryHigh": "3702874112", "CPUUsageNSec": "1828931000", "TasksCurrent": "4", "EffectiveTasksMax": "22362", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryZSwapWriteback": "yes", "MemoryLimit": 
"infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22362", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpR<<< 44109 1727204262.20051: stdout chunk (state=3): >>>eceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13976", "LimitNPROCSoft": "13976", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13976", "LimitSIGPENDINGSoft": "13976", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid 
cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", 
"SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "sysinit.target system.slice dbus.socket", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "NetworkManager-wait-online.service cloud-init.service network.target shutdown.target multi-user.target", "After": "network-pre.target sysinit.target system.slice basic.target dbus.socket systemd-journald.socket cloud-init-local.service dbus-broker.service", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Tue 2024-09-24 14:50:22 EDT", "StateChangeTimestampMonotonic": "413618667", "InactiveExitTimestamp": "Tue 2024-09-24 14:44:02 EDT", "InactiveExitTimestampMonotonic": "33322542", "ActiveEnterTimestamp": "Tue 2024-09-24 14:44:03 EDT", "ActiveEnterTimestampMonotonic": "34680535", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Tue 2024-09-24 14:44:02 EDT", "ConditionTimestampMonotonic": "33321151", "AssertTimestamp": 
"Tue 2024-09-24 14:44:02 EDT", "AssertTimestampMonotonic": "33321155", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "53c91cc8356748b484feba73dc5ee144", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} <<< 44109 1727204262.22220: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.47 closed. <<< 44109 1727204262.22252: stderr chunk (state=3): >>><<< 44109 1727204262.22256: stdout chunk (state=3): >>><<< 44109 1727204262.22274: _low_level_execute_command() done: rc=0, stdout= {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "705", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Tue 2024-09-24 14:44:02 EDT", 
"ExecMainStartTimestampMonotonic": "33322039", "ExecMainExitTimestampMonotonic": "0", "ExecMainHandoffTimestamp": "Tue 2024-09-24 14:44:02 EDT", "ExecMainHandoffTimestampMonotonic": "33336258", "ExecMainPID": "705", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "2938", "MemoryCurrent": "10747904", "MemoryPeak": "13869056", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3279679488", "EffectiveMemoryMax": "3702874112", "EffectiveMemoryHigh": "3702874112", "CPUUsageNSec": "1828931000", "TasksCurrent": "4", "EffectiveTasksMax": "22362", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", 
"CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryZSwapWriteback": "yes", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22362", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpReceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13976", "LimitNPROCSoft": "13976", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13976", "LimitSIGPENDINGSoft": "13976", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", 
"IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": 
"root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "sysinit.target system.slice dbus.socket", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "NetworkManager-wait-online.service cloud-init.service network.target shutdown.target multi-user.target", "After": "network-pre.target sysinit.target system.slice basic.target dbus.socket systemd-journald.socket cloud-init-local.service dbus-broker.service", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Tue 2024-09-24 14:50:22 EDT", "StateChangeTimestampMonotonic": "413618667", "InactiveExitTimestamp": "Tue 2024-09-24 14:44:02 EDT", 
"InactiveExitTimestampMonotonic": "33322542", "ActiveEnterTimestamp": "Tue 2024-09-24 14:44:03 EDT", "ActiveEnterTimestampMonotonic": "34680535", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Tue 2024-09-24 14:44:02 EDT", "ConditionTimestampMonotonic": "33321151", "AssertTimestamp": "Tue 2024-09-24 14:44:02 EDT", "AssertTimestampMonotonic": "33321155", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "53c91cc8356748b484feba73dc5ee144", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.47 closed. 44109 1727204262.22396: done with _execute_module (ansible.legacy.systemd, {'name': 'NetworkManager', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204261.744343-46752-249874026570313/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 44109 1727204262.22412: _low_level_execute_command(): starting 44109 1727204262.22419: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204261.744343-46752-249874026570313/ > /dev/null 2>&1 && sleep 0' 44109 1727204262.22852: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44109 1727204262.22895: stderr chunk (state=3): >>>debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44109 1727204262.22898: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found <<< 44109 1727204262.22900: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44109 1727204262.22902: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 44109 1727204262.22904: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44109 1727204262.22906: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44109 1727204262.22950: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 44109 1727204262.22953: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44109 1727204262.22956: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44109 1727204262.23039: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44109 1727204262.25000: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44109 1727204262.25026: stderr chunk (state=3): >>><<< 44109 1727204262.25029: stdout chunk (state=3): >>><<< 44109 1727204262.25044: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 
'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44109 1727204262.25050: handler run complete 44109 1727204262.25098: attempt loop complete, returning result 44109 1727204262.25101: _execute() done 44109 1727204262.25104: dumping result to json 44109 1727204262.25117: done dumping result, returning 44109 1727204262.25126: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [028d2410-947f-ed67-a560-00000000009d] 44109 1727204262.25128: sending task result for task 028d2410-947f-ed67-a560-00000000009d 44109 1727204262.25336: done sending task result for task 028d2410-947f-ed67-a560-00000000009d 44109 1727204262.25339: WORKER PROCESS EXITING ok: [managed-node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 44109 1727204262.25393: no more pending results, returning what we have 44109 1727204262.25396: results queue empty 44109 1727204262.25397: checking for any_errors_fatal 44109 1727204262.25404: done checking for any_errors_fatal 44109 1727204262.25405: checking 
for max_fail_percentage 44109 1727204262.25407: done checking for max_fail_percentage 44109 1727204262.25408: checking to see if all hosts have failed and the running result is not ok 44109 1727204262.25409: done checking to see if all hosts have failed 44109 1727204262.25410: getting the remaining hosts for this loop 44109 1727204262.25411: done getting the remaining hosts for this loop 44109 1727204262.25417: getting the next task for host managed-node1 44109 1727204262.25422: done getting next task for host managed-node1 44109 1727204262.25425: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 44109 1727204262.25427: ^ state is: HOST STATE: block=2, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44109 1727204262.25435: getting variables 44109 1727204262.25437: in VariableManager get_vars() 44109 1727204262.25508: Calling all_inventory to load vars for managed-node1 44109 1727204262.25511: Calling groups_inventory to load vars for managed-node1 44109 1727204262.25516: Calling all_plugins_inventory to load vars for managed-node1 44109 1727204262.25525: Calling all_plugins_play to load vars for managed-node1 44109 1727204262.25528: Calling groups_plugins_inventory to load vars for managed-node1 44109 1727204262.25530: Calling groups_plugins_play to load vars for managed-node1 44109 1727204262.26758: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44109 1727204262.28490: done with get_vars() 44109 1727204262.28517: done getting variables 44109 1727204262.28577: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Tuesday 24 September 2024 14:57:42 -0400 (0:00:00.757) 0:00:39.082 ***** 44109 1727204262.28606: entering _queue_task() for managed-node1/service 44109 1727204262.28953: worker is 1 (out of 1 available) 44109 1727204262.28966: exiting _queue_task() for managed-node1/service 44109 1727204262.28981: done queuing things up, now waiting for results queue to drain 44109 1727204262.28982: waiting for pending results... 44109 1727204262.29397: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 44109 1727204262.29402: in run() - task 028d2410-947f-ed67-a560-00000000009e 44109 1727204262.29425: variable 'ansible_search_path' from source: unknown 44109 1727204262.29432: variable 'ansible_search_path' from source: unknown 44109 1727204262.29494: calling self._execute() 44109 1727204262.29573: variable 'ansible_host' from source: host vars for 'managed-node1' 44109 1727204262.29586: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44109 1727204262.29605: variable 'omit' from source: magic vars 44109 1727204262.30037: variable 'ansible_distribution_major_version' from source: facts 44109 1727204262.30041: Evaluated conditional (ansible_distribution_major_version != '6'): True 44109 1727204262.30135: variable 'network_provider' from source: set_fact 44109 1727204262.30152: Evaluated conditional (network_provider == "nm"): True 44109 1727204262.30242: variable '__network_wpa_supplicant_required' from source: role '' defaults 44109 1727204262.30329: variable '__network_ieee802_1x_connections_defined' from source: role '' 
defaults 44109 1727204262.30506: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 44109 1727204262.32711: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 44109 1727204262.32821: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 44109 1727204262.32832: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 44109 1727204262.32871: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 44109 1727204262.32906: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 44109 1727204262.33010: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44109 1727204262.33082: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44109 1727204262.33089: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44109 1727204262.33139: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44109 1727204262.33179: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44109 1727204262.33210: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44109 1727204262.33240: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44109 1727204262.33366: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44109 1727204262.33369: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44109 1727204262.33371: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44109 1727204262.33373: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44109 1727204262.33390: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44109 1727204262.33416: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44109 1727204262.33455: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44109 
1727204262.33478: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44109 1727204262.33622: variable 'network_connections' from source: play vars 44109 1727204262.33637: variable 'profile' from source: play vars 44109 1727204262.33716: variable 'profile' from source: play vars 44109 1727204262.33726: variable 'interface' from source: set_fact 44109 1727204262.33786: variable 'interface' from source: set_fact 44109 1727204262.33860: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 44109 1727204262.34036: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 44109 1727204262.34079: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 44109 1727204262.34123: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 44109 1727204262.34155: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 44109 1727204262.34205: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 44109 1727204262.34242: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 44109 1727204262.34342: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 44109 1727204262.34345: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 44109 1727204262.34369: variable '__network_wireless_connections_defined' from source: role '' defaults 44109 1727204262.34636: variable 'network_connections' from source: play vars 44109 1727204262.34647: variable 'profile' from source: play vars 44109 1727204262.34718: variable 'profile' from source: play vars 44109 1727204262.34728: variable 'interface' from source: set_fact 44109 1727204262.34794: variable 'interface' from source: set_fact 44109 1727204262.34831: Evaluated conditional (__network_wpa_supplicant_required): False 44109 1727204262.34839: when evaluation is False, skipping this task 44109 1727204262.34847: _execute() done 44109 1727204262.34882: dumping result to json 44109 1727204262.34884: done dumping result, returning 44109 1727204262.34887: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [028d2410-947f-ed67-a560-00000000009e] 44109 1727204262.34893: sending task result for task 028d2410-947f-ed67-a560-00000000009e 44109 1727204262.35281: done sending task result for task 028d2410-947f-ed67-a560-00000000009e 44109 1727204262.35284: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "__network_wpa_supplicant_required", "skip_reason": "Conditional result was False" } 44109 1727204262.35328: no more pending results, returning what we have 44109 1727204262.35331: results queue empty 44109 1727204262.35332: checking for any_errors_fatal 44109 1727204262.35346: done checking for any_errors_fatal 44109 1727204262.35347: checking for max_fail_percentage 44109 1727204262.35349: done checking for max_fail_percentage 44109 1727204262.35350: checking to see if all hosts have failed and the running result is not ok 44109 1727204262.35351: done checking to see if all hosts have failed 44109 1727204262.35351: getting the remaining hosts for 
this loop 44109 1727204262.35353: done getting the remaining hosts for this loop 44109 1727204262.35356: getting the next task for host managed-node1 44109 1727204262.35362: done getting next task for host managed-node1 44109 1727204262.35366: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service 44109 1727204262.35368: ^ state is: HOST STATE: block=2, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44109 1727204262.35384: getting variables 44109 1727204262.35386: in VariableManager get_vars() 44109 1727204262.35424: Calling all_inventory to load vars for managed-node1 44109 1727204262.35427: Calling groups_inventory to load vars for managed-node1 44109 1727204262.35430: Calling all_plugins_inventory to load vars for managed-node1 44109 1727204262.35440: Calling all_plugins_play to load vars for managed-node1 44109 1727204262.35443: Calling groups_plugins_inventory to load vars for managed-node1 44109 1727204262.35446: Calling groups_plugins_play to load vars for managed-node1 44109 1727204262.36895: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44109 1727204262.38477: done with get_vars() 44109 1727204262.38506: done getting variables 44109 1727204262.38568: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable network service] ************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 Tuesday 24 September 2024 
14:57:42 -0400 (0:00:00.099) 0:00:39.182 ***** 44109 1727204262.38600: entering _queue_task() for managed-node1/service 44109 1727204262.38961: worker is 1 (out of 1 available) 44109 1727204262.38974: exiting _queue_task() for managed-node1/service 44109 1727204262.39189: done queuing things up, now waiting for results queue to drain 44109 1727204262.39190: waiting for pending results... 44109 1727204262.39277: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Enable network service 44109 1727204262.39396: in run() - task 028d2410-947f-ed67-a560-00000000009f 44109 1727204262.39423: variable 'ansible_search_path' from source: unknown 44109 1727204262.39430: variable 'ansible_search_path' from source: unknown 44109 1727204262.39467: calling self._execute() 44109 1727204262.39573: variable 'ansible_host' from source: host vars for 'managed-node1' 44109 1727204262.39588: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44109 1727204262.39602: variable 'omit' from source: magic vars 44109 1727204262.39999: variable 'ansible_distribution_major_version' from source: facts 44109 1727204262.40018: Evaluated conditional (ansible_distribution_major_version != '6'): True 44109 1727204262.40142: variable 'network_provider' from source: set_fact 44109 1727204262.40153: Evaluated conditional (network_provider == "initscripts"): False 44109 1727204262.40160: when evaluation is False, skipping this task 44109 1727204262.40169: _execute() done 44109 1727204262.40183: dumping result to json 44109 1727204262.40190: done dumping result, returning 44109 1727204262.40200: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Enable network service [028d2410-947f-ed67-a560-00000000009f] 44109 1727204262.40209: sending task result for task 028d2410-947f-ed67-a560-00000000009f 44109 1727204262.40419: done sending task result for task 028d2410-947f-ed67-a560-00000000009f 44109 1727204262.40422: 
WORKER PROCESS EXITING skipping: [managed-node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 44109 1727204262.40471: no more pending results, returning what we have 44109 1727204262.40477: results queue empty 44109 1727204262.40478: checking for any_errors_fatal 44109 1727204262.40489: done checking for any_errors_fatal 44109 1727204262.40490: checking for max_fail_percentage 44109 1727204262.40492: done checking for max_fail_percentage 44109 1727204262.40493: checking to see if all hosts have failed and the running result is not ok 44109 1727204262.40493: done checking to see if all hosts have failed 44109 1727204262.40494: getting the remaining hosts for this loop 44109 1727204262.40495: done getting the remaining hosts for this loop 44109 1727204262.40499: getting the next task for host managed-node1 44109 1727204262.40506: done getting next task for host managed-node1 44109 1727204262.40509: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 44109 1727204262.40512: ^ state is: HOST STATE: block=2, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44109 1727204262.40529: getting variables 44109 1727204262.40530: in VariableManager get_vars() 44109 1727204262.40567: Calling all_inventory to load vars for managed-node1 44109 1727204262.40570: Calling groups_inventory to load vars for managed-node1 44109 1727204262.40572: Calling all_plugins_inventory to load vars for managed-node1 44109 1727204262.40785: Calling all_plugins_play to load vars for managed-node1 44109 1727204262.40789: Calling groups_plugins_inventory to load vars for managed-node1 44109 1727204262.40793: Calling groups_plugins_play to load vars for managed-node1 44109 1727204262.42380: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44109 1727204262.43952: done with get_vars() 44109 1727204262.43983: done getting variables 44109 1727204262.44046: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150 Tuesday 24 September 2024 14:57:42 -0400 (0:00:00.054) 0:00:39.237 ***** 44109 1727204262.44079: entering _queue_task() for managed-node1/copy 44109 1727204262.44444: worker is 1 (out of 1 available) 44109 1727204262.44456: exiting _queue_task() for managed-node1/copy 44109 1727204262.44468: done queuing things up, now waiting for results queue to drain 44109 1727204262.44469: waiting for pending results... 
44109 1727204262.44893: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 44109 1727204262.44898: in run() - task 028d2410-947f-ed67-a560-0000000000a0 44109 1727204262.44900: variable 'ansible_search_path' from source: unknown 44109 1727204262.44903: variable 'ansible_search_path' from source: unknown 44109 1727204262.44945: calling self._execute() 44109 1727204262.45052: variable 'ansible_host' from source: host vars for 'managed-node1' 44109 1727204262.45064: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44109 1727204262.45079: variable 'omit' from source: magic vars 44109 1727204262.45475: variable 'ansible_distribution_major_version' from source: facts 44109 1727204262.45494: Evaluated conditional (ansible_distribution_major_version != '6'): True 44109 1727204262.45620: variable 'network_provider' from source: set_fact 44109 1727204262.45632: Evaluated conditional (network_provider == "initscripts"): False 44109 1727204262.45640: when evaluation is False, skipping this task 44109 1727204262.45647: _execute() done 44109 1727204262.45654: dumping result to json 44109 1727204262.45671: done dumping result, returning 44109 1727204262.45679: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [028d2410-947f-ed67-a560-0000000000a0] 44109 1727204262.45782: sending task result for task 028d2410-947f-ed67-a560-0000000000a0 44109 1727204262.45854: done sending task result for task 028d2410-947f-ed67-a560-0000000000a0 44109 1727204262.45857: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 44109 1727204262.45933: no more pending results, returning what we have 44109 1727204262.45938: results queue empty 44109 1727204262.45939: checking for 
any_errors_fatal 44109 1727204262.45944: done checking for any_errors_fatal 44109 1727204262.45945: checking for max_fail_percentage 44109 1727204262.45947: done checking for max_fail_percentage 44109 1727204262.45948: checking to see if all hosts have failed and the running result is not ok 44109 1727204262.45948: done checking to see if all hosts have failed 44109 1727204262.45949: getting the remaining hosts for this loop 44109 1727204262.45950: done getting the remaining hosts for this loop 44109 1727204262.45954: getting the next task for host managed-node1 44109 1727204262.45961: done getting next task for host managed-node1 44109 1727204262.45965: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles 44109 1727204262.45967: ^ state is: HOST STATE: block=2, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44109 1727204262.45983: getting variables 44109 1727204262.45984: in VariableManager get_vars() 44109 1727204262.46024: Calling all_inventory to load vars for managed-node1 44109 1727204262.46027: Calling groups_inventory to load vars for managed-node1 44109 1727204262.46029: Calling all_plugins_inventory to load vars for managed-node1 44109 1727204262.46041: Calling all_plugins_play to load vars for managed-node1 44109 1727204262.46044: Calling groups_plugins_inventory to load vars for managed-node1 44109 1727204262.46047: Calling groups_plugins_play to load vars for managed-node1 44109 1727204262.47669: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44109 1727204262.49257: done with get_vars() 44109 1727204262.49286: done getting variables TASK [fedora.linux_system_roles.network : Configure networking connection profiles] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Tuesday 24 September 2024 14:57:42 -0400 (0:00:00.052) 0:00:39.289 ***** 44109 1727204262.49371: entering _queue_task() for managed-node1/fedora.linux_system_roles.network_connections 44109 1727204262.49705: worker is 1 (out of 1 available) 44109 1727204262.49720: exiting _queue_task() for managed-node1/fedora.linux_system_roles.network_connections 44109 1727204262.49732: done queuing things up, now waiting for results queue to drain 44109 1727204262.49733: waiting for pending results... 
44109 1727204262.50017: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 44109 1727204262.50182: in run() - task 028d2410-947f-ed67-a560-0000000000a1 44109 1727204262.50185: variable 'ansible_search_path' from source: unknown 44109 1727204262.50187: variable 'ansible_search_path' from source: unknown 44109 1727204262.50196: calling self._execute() 44109 1727204262.50299: variable 'ansible_host' from source: host vars for 'managed-node1' 44109 1727204262.50319: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44109 1727204262.50334: variable 'omit' from source: magic vars 44109 1727204262.50714: variable 'ansible_distribution_major_version' from source: facts 44109 1727204262.50746: Evaluated conditional (ansible_distribution_major_version != '6'): True 44109 1727204262.50749: variable 'omit' from source: magic vars 44109 1727204262.50981: variable 'omit' from source: magic vars 44109 1727204262.50985: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 44109 1727204262.53416: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 44109 1727204262.53487: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 44109 1727204262.53538: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 44109 1727204262.53580: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 44109 1727204262.53620: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 44109 1727204262.53700: variable 'network_provider' from source: set_fact 44109 1727204262.53843: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44109 1727204262.53874: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44109 1727204262.53904: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44109 1727204262.53952: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44109 1727204262.53970: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44109 1727204262.54048: variable 'omit' from source: magic vars 44109 1727204262.54164: variable 'omit' from source: magic vars 44109 1727204262.54267: variable 'network_connections' from source: play vars 44109 1727204262.54282: variable 'profile' from source: play vars 44109 1727204262.54350: variable 'profile' from source: play vars 44109 1727204262.54359: variable 'interface' from source: set_fact 44109 1727204262.54422: variable 'interface' from source: set_fact 44109 1727204262.54563: variable 'omit' from source: magic vars 44109 1727204262.54581: variable '__lsr_ansible_managed' from source: task vars 44109 1727204262.54644: variable '__lsr_ansible_managed' from source: task vars 44109 1727204262.54980: Loaded config def from plugin (lookup/template) 44109 1727204262.54983: Loading LookupModule 'template' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/template.py 44109 1727204262.54985: File lookup term: get_ansible_managed.j2 44109 
1727204262.54988: variable 'ansible_search_path' from source: unknown 44109 1727204262.54990: evaluation_path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks 44109 1727204262.55003: search_path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/templates/get_ansible_managed.j2 /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/get_ansible_managed.j2 /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/templates/get_ansible_managed.j2 /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/get_ansible_managed.j2 /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 44109 1727204262.55031: variable 'ansible_search_path' from source: unknown 44109 1727204262.65591: variable 'ansible_managed' from source: unknown 44109 1727204262.65674: variable 'omit' from source: magic vars 44109 1727204262.65696: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 44109 1727204262.65714: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 44109 1727204262.65728: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 44109 1727204262.65739: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44109 1727204262.65746: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44109 1727204262.65761: variable 'inventory_hostname' from source: host vars for 'managed-node1' 44109 1727204262.65764: variable 'ansible_host' from source: host vars for 'managed-node1' 44109 1727204262.65766: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44109 1727204262.65827: Set connection var ansible_connection to ssh 44109 1727204262.65830: Set connection var ansible_timeout to 10 44109 1727204262.65836: Set connection var ansible_module_compression to ZIP_DEFLATED 44109 1727204262.65843: Set connection var ansible_pipelining to False 44109 1727204262.65847: Set connection var ansible_shell_executable to /bin/sh 44109 1727204262.65852: Set connection var ansible_shell_type to sh 44109 1727204262.65869: variable 'ansible_shell_executable' from source: unknown 44109 1727204262.65872: variable 'ansible_connection' from source: unknown 44109 1727204262.65875: variable 'ansible_module_compression' from source: unknown 44109 1727204262.65879: variable 'ansible_shell_type' from source: unknown 44109 1727204262.65882: variable 'ansible_shell_executable' from source: unknown 44109 1727204262.65884: variable 'ansible_host' from source: host vars for 'managed-node1' 44109 1727204262.65886: variable 'ansible_pipelining' from source: unknown 44109 1727204262.65888: variable 'ansible_timeout' from source: unknown 44109 1727204262.65893: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44109 1727204262.65980: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 44109 1727204262.65991: variable 'omit' from source: magic vars 44109 1727204262.65994: starting attempt loop 44109 1727204262.65996: running the handler 44109 1727204262.66004: _low_level_execute_command(): starting 44109 1727204262.66008: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 44109 1727204262.66708: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44109 1727204262.66712: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 44109 1727204262.66715: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44109 1727204262.66717: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44109 1727204262.66784: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44109 1727204262.68591: stdout chunk (state=3): >>>/root <<< 44109 1727204262.68707: stderr chunk (state=3): >>>debug2: Received exit status from 
master 0 <<< 44109 1727204262.68748: stderr chunk (state=3): >>><<< 44109 1727204262.68751: stdout chunk (state=3): >>><<< 44109 1727204262.68780: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44109 1727204262.68791: _low_level_execute_command(): starting 44109 1727204262.68797: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204262.6878002-46790-79204159447452 `" && echo ansible-tmp-1727204262.6878002-46790-79204159447452="` echo /root/.ansible/tmp/ansible-tmp-1727204262.6878002-46790-79204159447452 `" ) && sleep 0' 44109 1727204262.69414: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 44109 1727204262.69510: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44109 1727204262.71609: stdout chunk (state=3): >>>ansible-tmp-1727204262.6878002-46790-79204159447452=/root/.ansible/tmp/ansible-tmp-1727204262.6878002-46790-79204159447452 <<< 44109 1727204262.71740: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44109 1727204262.71759: stderr chunk (state=3): >>><<< 44109 1727204262.71762: stdout chunk (state=3): >>><<< 44109 1727204262.71800: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204262.6878002-46790-79204159447452=/root/.ansible/tmp/ansible-tmp-1727204262.6878002-46790-79204159447452 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration 
data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44109 1727204262.71849: variable 'ansible_module_compression' from source: unknown 44109 1727204262.72082: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-44109pzfqangk/ansiballz_cache/ansible_collections.fedora.linux_system_roles.plugins.modules.network_connections-ZIP_DEFLATED 44109 1727204262.72086: variable 'ansible_facts' from source: unknown 44109 1727204262.72089: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204262.6878002-46790-79204159447452/AnsiballZ_network_connections.py 44109 1727204262.72218: Sending initial data 44109 1727204262.72222: Sent initial data (167 bytes) 44109 1727204262.72761: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 44109 1727204262.72767: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44109 1727204262.72787: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found <<< 44109 1727204262.72800: stderr chunk (state=3): >>>debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44109 1727204262.72818: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44109 1727204262.72859: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 44109 1727204262.72894: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 44109 1727204262.72969: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44109 1727204262.74728: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 44109 1727204262.74809: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 44109 1727204262.74888: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-44109pzfqangk/tmprsar_vgc /root/.ansible/tmp/ansible-tmp-1727204262.6878002-46790-79204159447452/AnsiballZ_network_connections.py <<< 44109 1727204262.74899: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204262.6878002-46790-79204159447452/AnsiballZ_network_connections.py" <<< 44109 1727204262.74951: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-44109pzfqangk/tmprsar_vgc" to remote "/root/.ansible/tmp/ansible-tmp-1727204262.6878002-46790-79204159447452/AnsiballZ_network_connections.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204262.6878002-46790-79204159447452/AnsiballZ_network_connections.py" <<< 44109 1727204262.76519: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44109 1727204262.76563: stderr chunk (state=3): >>><<< 44109 1727204262.76567: stdout chunk (state=3): >>><<< 44109 1727204262.76589: done transferring module to remote 44109 1727204262.76620: _low_level_execute_command(): starting 44109 1727204262.76634: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204262.6878002-46790-79204159447452/ /root/.ansible/tmp/ansible-tmp-1727204262.6878002-46790-79204159447452/AnsiballZ_network_connections.py && sleep 0' 44109 1727204262.77269: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44109 1727204262.77297: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44109 1727204262.77449: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44109 1727204262.77453: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 44109 1727204262.77455: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44109 1727204262.77457: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44109 1727204262.77533: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44109 1727204262.79499: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44109 1727204262.79565: stderr chunk (state=3): >>><<< 44109 1727204262.79568: stdout chunk (state=3): >>><<< 44109 1727204262.79656: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44109 1727204262.79659: _low_level_execute_command(): starting 44109 1727204262.79662: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204262.6878002-46790-79204159447452/AnsiballZ_network_connections.py && sleep 0' 44109 1727204262.80196: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44109 1727204262.80211: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44109 1727204262.80234: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44109 1727204262.80252: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44109 1727204262.80342: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44109 1727204262.80367: stderr chunk (state=3): >>>debug1: 
auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 44109 1727204262.80384: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44109 1727204262.80410: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44109 1727204262.80529: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44109 1727204263.09669: stdout chunk (state=3): >>>Traceback (most recent call last): File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_z7rj32mr/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_z7rj32mr/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on ethtest0/51eb3d23-af2d-42f8-aa46-41043d97d664: error=unknown <<< 44109 1727204263.09823: stdout chunk (state=3): >>> {"changed": true, "warnings": [], "stderr": "\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "ethtest0", "persistent_state": "absent"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "ethtest0", "persistent_state": "absent"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} <<< 44109 1727204263.12097: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.47 closed. 
<<< 44109 1727204263.12107: stdout chunk (state=3): >>><<< 44109 1727204263.12109: stderr chunk (state=3): >>><<< 44109 1727204263.12387: _low_level_execute_command() done: rc=0, stdout=Traceback (most recent call last): File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_z7rj32mr/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_z7rj32mr/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on ethtest0/51eb3d23-af2d-42f8-aa46-41043d97d664: error=unknown {"changed": true, "warnings": [], "stderr": "\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "ethtest0", "persistent_state": "absent"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "ethtest0", "persistent_state": "absent"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final 
Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.47 closed. 44109 1727204263.12391: done with _execute_module (fedora.linux_system_roles.network_connections, {'provider': 'nm', 'connections': [{'name': 'ethtest0', 'persistent_state': 'absent'}], '__header': '#\n# Ansible managed\n#\n# system_role:network\n', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'fedora.linux_system_roles.network_connections', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204262.6878002-46790-79204159447452/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 44109 1727204263.12394: _low_level_execute_command(): starting 44109 1727204263.12396: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204262.6878002-46790-79204159447452/ > /dev/null 2>&1 && sleep 0' 44109 1727204263.13132: 
stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44109 1727204263.13140: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44109 1727204263.13151: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44109 1727204263.13190: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44109 1727204263.13262: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 44109 1727204263.13287: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 44109 1727204263.13390: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44109 1727204263.15582: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44109 1727204263.15585: stdout chunk (state=3): >>><<< 44109 1727204263.15588: stderr chunk (state=3): >>><<< 44109 1727204263.15591: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf 
debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44109 1727204263.15593: handler run complete 44109 1727204263.15596: attempt loop complete, returning result 44109 1727204263.15598: _execute() done 44109 1727204263.15600: dumping result to json 44109 1727204263.15602: done dumping result, returning 44109 1727204263.15605: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [028d2410-947f-ed67-a560-0000000000a1] 44109 1727204263.15611: sending task result for task 028d2410-947f-ed67-a560-0000000000a1 changed: [managed-node1] => { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "name": "ethtest0", "persistent_state": "absent" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true } STDERR: 44109 1727204263.15783: no more pending results, returning what we have 44109 1727204263.15786: results queue empty 44109 1727204263.15787: checking for any_errors_fatal 44109 1727204263.15794: 
done checking for any_errors_fatal 44109 1727204263.15795: checking for max_fail_percentage 44109 1727204263.15797: done checking for max_fail_percentage 44109 1727204263.15798: checking to see if all hosts have failed and the running result is not ok 44109 1727204263.15799: done checking to see if all hosts have failed 44109 1727204263.15799: getting the remaining hosts for this loop 44109 1727204263.15801: done getting the remaining hosts for this loop 44109 1727204263.15804: getting the next task for host managed-node1 44109 1727204263.15810: done getting next task for host managed-node1 44109 1727204263.15816: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 44109 1727204263.15818: ^ state is: HOST STATE: block=2, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44109 1727204263.15829: getting variables 44109 1727204263.15830: in VariableManager get_vars() 44109 1727204263.16274: Calling all_inventory to load vars for managed-node1 44109 1727204263.16280: Calling groups_inventory to load vars for managed-node1 44109 1727204263.16282: Calling all_plugins_inventory to load vars for managed-node1 44109 1727204263.16296: Calling all_plugins_play to load vars for managed-node1 44109 1727204263.16298: Calling groups_plugins_inventory to load vars for managed-node1 44109 1727204263.16301: Calling groups_plugins_play to load vars for managed-node1 44109 1727204263.16935: done sending task result for task 028d2410-947f-ed67-a560-0000000000a1 44109 1727204263.16939: WORKER PROCESS EXITING 44109 1727204263.17900: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44109 1727204263.19599: done with get_vars() 44109 1727204263.19635: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Tuesday 24 September 2024 14:57:43 -0400 (0:00:00.703) 0:00:39.993 ***** 44109 1727204263.19719: entering _queue_task() for managed-node1/fedora.linux_system_roles.network_state 44109 1727204263.20253: worker is 1 (out of 1 available) 44109 1727204263.20265: exiting _queue_task() for managed-node1/fedora.linux_system_roles.network_state 44109 1727204263.20299: done queuing things up, now waiting for results queue to drain 44109 1727204263.20301: waiting for pending results... 
44109 1727204263.20549: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Configure networking state 44109 1727204263.20681: in run() - task 028d2410-947f-ed67-a560-0000000000a2 44109 1727204263.20707: variable 'ansible_search_path' from source: unknown 44109 1727204263.20720: variable 'ansible_search_path' from source: unknown 44109 1727204263.20768: calling self._execute() 44109 1727204263.20896: variable 'ansible_host' from source: host vars for 'managed-node1' 44109 1727204263.20908: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44109 1727204263.20927: variable 'omit' from source: magic vars 44109 1727204263.21351: variable 'ansible_distribution_major_version' from source: facts 44109 1727204263.21428: Evaluated conditional (ansible_distribution_major_version != '6'): True 44109 1727204263.21503: variable 'network_state' from source: role '' defaults 44109 1727204263.21522: Evaluated conditional (network_state != {}): False 44109 1727204263.21535: when evaluation is False, skipping this task 44109 1727204263.21548: _execute() done 44109 1727204263.21556: dumping result to json 44109 1727204263.21563: done dumping result, returning 44109 1727204263.21573: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Configure networking state [028d2410-947f-ed67-a560-0000000000a2] 44109 1727204263.21648: sending task result for task 028d2410-947f-ed67-a560-0000000000a2 44109 1727204263.21723: done sending task result for task 028d2410-947f-ed67-a560-0000000000a2 44109 1727204263.21727: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 44109 1727204263.21793: no more pending results, returning what we have 44109 1727204263.21798: results queue empty 44109 1727204263.21799: checking for any_errors_fatal 44109 1727204263.21811: done checking for any_errors_fatal 
44109 1727204263.21815: checking for max_fail_percentage 44109 1727204263.21817: done checking for max_fail_percentage 44109 1727204263.21818: checking to see if all hosts have failed and the running result is not ok 44109 1727204263.21819: done checking to see if all hosts have failed 44109 1727204263.21820: getting the remaining hosts for this loop 44109 1727204263.21821: done getting the remaining hosts for this loop 44109 1727204263.21825: getting the next task for host managed-node1 44109 1727204263.21832: done getting next task for host managed-node1 44109 1727204263.21836: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 44109 1727204263.21838: ^ state is: HOST STATE: block=2, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44109 1727204263.21855: getting variables 44109 1727204263.21859: in VariableManager get_vars() 44109 1727204263.21900: Calling all_inventory to load vars for managed-node1 44109 1727204263.21904: Calling groups_inventory to load vars for managed-node1 44109 1727204263.21907: Calling all_plugins_inventory to load vars for managed-node1 44109 1727204263.21922: Calling all_plugins_play to load vars for managed-node1 44109 1727204263.21925: Calling groups_plugins_inventory to load vars for managed-node1 44109 1727204263.21928: Calling groups_plugins_play to load vars for managed-node1 44109 1727204263.23621: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44109 1727204263.25461: done with get_vars() 44109 1727204263.25485: done getting variables 44109 1727204263.25555: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Tuesday 24 September 2024 14:57:43 -0400 (0:00:00.058) 0:00:40.052 ***** 44109 1727204263.25589: entering _queue_task() for managed-node1/debug 44109 1727204263.25943: worker is 1 (out of 1 available) 44109 1727204263.25957: exiting _queue_task() for managed-node1/debug 44109 1727204263.26084: done queuing things up, now waiting for results queue to drain 44109 1727204263.26086: waiting for pending results... 
44109 1727204263.26398: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 44109 1727204263.26494: in run() - task 028d2410-947f-ed67-a560-0000000000a3 44109 1727204263.26498: variable 'ansible_search_path' from source: unknown 44109 1727204263.26501: variable 'ansible_search_path' from source: unknown 44109 1727204263.26516: calling self._execute() 44109 1727204263.26635: variable 'ansible_host' from source: host vars for 'managed-node1' 44109 1727204263.26647: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44109 1727204263.26714: variable 'omit' from source: magic vars 44109 1727204263.27079: variable 'ansible_distribution_major_version' from source: facts 44109 1727204263.27095: Evaluated conditional (ansible_distribution_major_version != '6'): True 44109 1727204263.27107: variable 'omit' from source: magic vars 44109 1727204263.27164: variable 'omit' from source: magic vars 44109 1727204263.27207: variable 'omit' from source: magic vars 44109 1727204263.27265: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 44109 1727204263.27308: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 44109 1727204263.27364: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 44109 1727204263.27367: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44109 1727204263.27382: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44109 1727204263.27420: variable 'inventory_hostname' from source: host vars for 'managed-node1' 44109 1727204263.27472: variable 'ansible_host' from source: host vars for 'managed-node1' 44109 1727204263.27475: variable 'ansible_ssh_extra_args' from 
source: host vars for 'managed-node1' 44109 1727204263.27555: Set connection var ansible_connection to ssh 44109 1727204263.27567: Set connection var ansible_timeout to 10 44109 1727204263.27587: Set connection var ansible_module_compression to ZIP_DEFLATED 44109 1727204263.27600: Set connection var ansible_pipelining to False 44109 1727204263.27610: Set connection var ansible_shell_executable to /bin/sh 44109 1727204263.27622: Set connection var ansible_shell_type to sh 44109 1727204263.27649: variable 'ansible_shell_executable' from source: unknown 44109 1727204263.27690: variable 'ansible_connection' from source: unknown 44109 1727204263.27693: variable 'ansible_module_compression' from source: unknown 44109 1727204263.27695: variable 'ansible_shell_type' from source: unknown 44109 1727204263.27697: variable 'ansible_shell_executable' from source: unknown 44109 1727204263.27698: variable 'ansible_host' from source: host vars for 'managed-node1' 44109 1727204263.27700: variable 'ansible_pipelining' from source: unknown 44109 1727204263.27701: variable 'ansible_timeout' from source: unknown 44109 1727204263.27703: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44109 1727204263.27832: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 44109 1727204263.27848: variable 'omit' from source: magic vars 44109 1727204263.27909: starting attempt loop 44109 1727204263.27914: running the handler 44109 1727204263.27999: variable '__network_connections_result' from source: set_fact 44109 1727204263.28060: handler run complete 44109 1727204263.28083: attempt loop complete, returning result 44109 1727204263.28089: _execute() done 44109 1727204263.28094: dumping result to json 44109 1727204263.28100: 
done dumping result, returning 44109 1727204263.28111: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [028d2410-947f-ed67-a560-0000000000a3] 44109 1727204263.28129: sending task result for task 028d2410-947f-ed67-a560-0000000000a3 ok: [managed-node1] => { "__network_connections_result.stderr_lines": [ "" ] } 44109 1727204263.28294: no more pending results, returning what we have 44109 1727204263.28298: results queue empty 44109 1727204263.28299: checking for any_errors_fatal 44109 1727204263.28307: done checking for any_errors_fatal 44109 1727204263.28308: checking for max_fail_percentage 44109 1727204263.28310: done checking for max_fail_percentage 44109 1727204263.28311: checking to see if all hosts have failed and the running result is not ok 44109 1727204263.28314: done checking to see if all hosts have failed 44109 1727204263.28315: getting the remaining hosts for this loop 44109 1727204263.28316: done getting the remaining hosts for this loop 44109 1727204263.28320: getting the next task for host managed-node1 44109 1727204263.28328: done getting next task for host managed-node1 44109 1727204263.28332: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 44109 1727204263.28343: ^ state is: HOST STATE: block=2, task=23, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44109 1727204263.28356: getting variables 44109 1727204263.28358: in VariableManager get_vars() 44109 1727204263.28396: Calling all_inventory to load vars for managed-node1 44109 1727204263.28399: Calling groups_inventory to load vars for managed-node1 44109 1727204263.28401: Calling all_plugins_inventory to load vars for managed-node1 44109 1727204263.28411: Calling all_plugins_play to load vars for managed-node1 44109 1727204263.28417: Calling groups_plugins_inventory to load vars for managed-node1 44109 1727204263.28420: Calling groups_plugins_play to load vars for managed-node1 44109 1727204263.29102: done sending task result for task 028d2410-947f-ed67-a560-0000000000a3 44109 1727204263.29106: WORKER PROCESS EXITING 44109 1727204263.36132: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44109 1727204263.37742: done with get_vars() 44109 1727204263.37768: done getting variables 44109 1727204263.37821: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_connections] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181 Tuesday 24 September 2024 14:57:43 -0400 (0:00:00.122) 0:00:40.174 ***** 44109 1727204263.37847: entering _queue_task() for managed-node1/debug 44109 1727204263.38205: worker is 1 (out of 1 available) 44109 1727204263.38220: exiting _queue_task() for managed-node1/debug 44109 1727204263.38232: done queuing things up, now waiting for results queue to drain 44109 1727204263.38233: waiting for pending results... 
44109 1727204263.38532: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 44109 1727204263.38656: in run() - task 028d2410-947f-ed67-a560-0000000000a4 44109 1727204263.38674: variable 'ansible_search_path' from source: unknown 44109 1727204263.38686: variable 'ansible_search_path' from source: unknown 44109 1727204263.38734: calling self._execute() 44109 1727204263.38843: variable 'ansible_host' from source: host vars for 'managed-node1' 44109 1727204263.38881: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44109 1727204263.38884: variable 'omit' from source: magic vars 44109 1727204263.39268: variable 'ansible_distribution_major_version' from source: facts 44109 1727204263.39286: Evaluated conditional (ansible_distribution_major_version != '6'): True 44109 1727204263.39296: variable 'omit' from source: magic vars 44109 1727204263.39352: variable 'omit' from source: magic vars 44109 1727204263.39457: variable 'omit' from source: magic vars 44109 1727204263.39460: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 44109 1727204263.39481: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 44109 1727204263.39507: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 44109 1727204263.39532: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44109 1727204263.39547: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44109 1727204263.39586: variable 'inventory_hostname' from source: host vars for 'managed-node1' 44109 1727204263.39594: variable 'ansible_host' from source: host vars for 'managed-node1' 44109 1727204263.39601: variable 'ansible_ssh_extra_args' from 
source: host vars for 'managed-node1' 44109 1727204263.39707: Set connection var ansible_connection to ssh 44109 1727204263.39790: Set connection var ansible_timeout to 10 44109 1727204263.39793: Set connection var ansible_module_compression to ZIP_DEFLATED 44109 1727204263.39796: Set connection var ansible_pipelining to False 44109 1727204263.39798: Set connection var ansible_shell_executable to /bin/sh 44109 1727204263.39801: Set connection var ansible_shell_type to sh 44109 1727204263.39802: variable 'ansible_shell_executable' from source: unknown 44109 1727204263.39805: variable 'ansible_connection' from source: unknown 44109 1727204263.39807: variable 'ansible_module_compression' from source: unknown 44109 1727204263.39809: variable 'ansible_shell_type' from source: unknown 44109 1727204263.39811: variable 'ansible_shell_executable' from source: unknown 44109 1727204263.39815: variable 'ansible_host' from source: host vars for 'managed-node1' 44109 1727204263.39817: variable 'ansible_pipelining' from source: unknown 44109 1727204263.39819: variable 'ansible_timeout' from source: unknown 44109 1727204263.39821: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44109 1727204263.39972: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 44109 1727204263.39992: variable 'omit' from source: magic vars 44109 1727204263.40008: starting attempt loop 44109 1727204263.40018: running the handler 44109 1727204263.40068: variable '__network_connections_result' from source: set_fact 44109 1727204263.40281: variable '__network_connections_result' from source: set_fact 44109 1727204263.40284: handler run complete 44109 1727204263.40286: attempt loop complete, returning result 44109 1727204263.40289: 
_execute() done 44109 1727204263.40294: dumping result to json 44109 1727204263.40303: done dumping result, returning 44109 1727204263.40317: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [028d2410-947f-ed67-a560-0000000000a4] 44109 1727204263.40327: sending task result for task 028d2410-947f-ed67-a560-0000000000a4 44109 1727204263.40581: done sending task result for task 028d2410-947f-ed67-a560-0000000000a4 44109 1727204263.40585: WORKER PROCESS EXITING ok: [managed-node1] => { "__network_connections_result": { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "name": "ethtest0", "persistent_state": "absent" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true, "failed": false, "stderr": "\n", "stderr_lines": [ "" ] } } 44109 1727204263.40668: no more pending results, returning what we have 44109 1727204263.40672: results queue empty 44109 1727204263.40673: checking for any_errors_fatal 44109 1727204263.40683: done checking for any_errors_fatal 44109 1727204263.40684: checking for max_fail_percentage 44109 1727204263.40686: done checking for max_fail_percentage 44109 1727204263.40687: checking to see if all hosts have failed and the running result is not ok 44109 1727204263.40688: done checking to see if all hosts have failed 44109 1727204263.40688: getting the remaining hosts for this loop 44109 1727204263.40690: done getting the remaining hosts for this loop 44109 1727204263.40693: getting the next task for host managed-node1 44109 1727204263.40700: done getting next task for host managed-node1 44109 1727204263.40704: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 44109 1727204263.40706: ^ state is: HOST STATE: block=2, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, 
pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44109 1727204263.40718: getting variables 44109 1727204263.40720: in VariableManager get_vars() 44109 1727204263.40756: Calling all_inventory to load vars for managed-node1 44109 1727204263.40759: Calling groups_inventory to load vars for managed-node1 44109 1727204263.40762: Calling all_plugins_inventory to load vars for managed-node1 44109 1727204263.40772: Calling all_plugins_play to load vars for managed-node1 44109 1727204263.40959: Calling groups_plugins_inventory to load vars for managed-node1 44109 1727204263.40967: Calling groups_plugins_play to load vars for managed-node1 44109 1727204263.42817: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44109 1727204263.45262: done with get_vars() 44109 1727204263.45291: done getting variables 44109 1727204263.45356: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186 Tuesday 24 September 2024 14:57:43 -0400 (0:00:00.075) 0:00:40.250 ***** 44109 1727204263.45396: entering _queue_task() for managed-node1/debug 44109 1727204263.45760: worker is 1 (out of 1 available) 44109 1727204263.45770: exiting _queue_task() for managed-node1/debug 44109 1727204263.45986: done queuing things up, now waiting for results queue to drain 44109 1727204263.45987: waiting for pending results... 
44109 1727204263.46083: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 44109 1727204263.46198: in run() - task 028d2410-947f-ed67-a560-0000000000a5 44109 1727204263.46225: variable 'ansible_search_path' from source: unknown 44109 1727204263.46233: variable 'ansible_search_path' from source: unknown 44109 1727204263.46272: calling self._execute() 44109 1727204263.46399: variable 'ansible_host' from source: host vars for 'managed-node1' 44109 1727204263.46410: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44109 1727204263.46430: variable 'omit' from source: magic vars 44109 1727204263.46974: variable 'ansible_distribution_major_version' from source: facts 44109 1727204263.46980: Evaluated conditional (ansible_distribution_major_version != '6'): True 44109 1727204263.47188: variable 'network_state' from source: role '' defaults 44109 1727204263.47205: Evaluated conditional (network_state != {}): False 44109 1727204263.47215: when evaluation is False, skipping this task 44109 1727204263.47222: _execute() done 44109 1727204263.47229: dumping result to json 44109 1727204263.47386: done dumping result, returning 44109 1727204263.47389: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [028d2410-947f-ed67-a560-0000000000a5] 44109 1727204263.47392: sending task result for task 028d2410-947f-ed67-a560-0000000000a5 44109 1727204263.47469: done sending task result for task 028d2410-947f-ed67-a560-0000000000a5 44109 1727204263.47472: WORKER PROCESS EXITING skipping: [managed-node1] => { "false_condition": "network_state != {}" } 44109 1727204263.47527: no more pending results, returning what we have 44109 1727204263.47531: results queue empty 44109 1727204263.47532: checking for any_errors_fatal 44109 1727204263.47540: done checking for any_errors_fatal 44109 1727204263.47541: checking for 
max_fail_percentage 44109 1727204263.47543: done checking for max_fail_percentage 44109 1727204263.47544: checking to see if all hosts have failed and the running result is not ok 44109 1727204263.47545: done checking to see if all hosts have failed 44109 1727204263.47546: getting the remaining hosts for this loop 44109 1727204263.47547: done getting the remaining hosts for this loop 44109 1727204263.47551: getting the next task for host managed-node1 44109 1727204263.47558: done getting next task for host managed-node1 44109 1727204263.47562: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity 44109 1727204263.47565: ^ state is: HOST STATE: block=2, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44109 1727204263.47581: getting variables 44109 1727204263.47583: in VariableManager get_vars() 44109 1727204263.47625: Calling all_inventory to load vars for managed-node1 44109 1727204263.47628: Calling groups_inventory to load vars for managed-node1 44109 1727204263.47631: Calling all_plugins_inventory to load vars for managed-node1 44109 1727204263.47643: Calling all_plugins_play to load vars for managed-node1 44109 1727204263.47647: Calling groups_plugins_inventory to load vars for managed-node1 44109 1727204263.47649: Calling groups_plugins_play to load vars for managed-node1 44109 1727204263.49705: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44109 1727204263.51955: done with get_vars() 44109 1727204263.51987: done getting variables TASK [fedora.linux_system_roles.network : Re-test connectivity] **************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192 Tuesday 24 September 2024 14:57:43 -0400 
(0:00:00.068) 0:00:40.318 ***** 44109 1727204263.52195: entering _queue_task() for managed-node1/ping 44109 1727204263.52811: worker is 1 (out of 1 available) 44109 1727204263.52828: exiting _queue_task() for managed-node1/ping 44109 1727204263.52840: done queuing things up, now waiting for results queue to drain 44109 1727204263.52841: waiting for pending results... 44109 1727204263.53595: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Re-test connectivity 44109 1727204263.53600: in run() - task 028d2410-947f-ed67-a560-0000000000a6 44109 1727204263.53603: variable 'ansible_search_path' from source: unknown 44109 1727204263.53605: variable 'ansible_search_path' from source: unknown 44109 1727204263.53608: calling self._execute() 44109 1727204263.53658: variable 'ansible_host' from source: host vars for 'managed-node1' 44109 1727204263.53669: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44109 1727204263.53686: variable 'omit' from source: magic vars 44109 1727204263.54098: variable 'ansible_distribution_major_version' from source: facts 44109 1727204263.54119: Evaluated conditional (ansible_distribution_major_version != '6'): True 44109 1727204263.54131: variable 'omit' from source: magic vars 44109 1727204263.54182: variable 'omit' from source: magic vars 44109 1727204263.54208: variable 'omit' from source: magic vars 44109 1727204263.54260: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 44109 1727204263.54289: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 44109 1727204263.54304: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 44109 1727204263.54321: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44109 1727204263.54333: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44109 1727204263.54357: variable 'inventory_hostname' from source: host vars for 'managed-node1' 44109 1727204263.54360: variable 'ansible_host' from source: host vars for 'managed-node1' 44109 1727204263.54363: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44109 1727204263.54443: Set connection var ansible_connection to ssh 44109 1727204263.54447: Set connection var ansible_timeout to 10 44109 1727204263.54452: Set connection var ansible_module_compression to ZIP_DEFLATED 44109 1727204263.54459: Set connection var ansible_pipelining to False 44109 1727204263.54464: Set connection var ansible_shell_executable to /bin/sh 44109 1727204263.54469: Set connection var ansible_shell_type to sh 44109 1727204263.54490: variable 'ansible_shell_executable' from source: unknown 44109 1727204263.54493: variable 'ansible_connection' from source: unknown 44109 1727204263.54496: variable 'ansible_module_compression' from source: unknown 44109 1727204263.54498: variable 'ansible_shell_type' from source: unknown 44109 1727204263.54500: variable 'ansible_shell_executable' from source: unknown 44109 1727204263.54503: variable 'ansible_host' from source: host vars for 'managed-node1' 44109 1727204263.54505: variable 'ansible_pipelining' from source: unknown 44109 1727204263.54507: variable 'ansible_timeout' from source: unknown 44109 1727204263.54511: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44109 1727204263.54664: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 44109 1727204263.54673: variable 'omit' from source: magic vars 44109 1727204263.54678: starting attempt loop 44109 1727204263.54681: running 
the handler 44109 1727204263.54697: _low_level_execute_command(): starting 44109 1727204263.54701: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 44109 1727204263.55468: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44109 1727204263.55521: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found <<< 44109 1727204263.55524: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44109 1727204263.55526: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration <<< 44109 1727204263.55530: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found <<< 44109 1727204263.55533: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44109 1727204263.55592: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 44109 1727204263.55595: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44109 1727204263.55598: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44109 1727204263.55690: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44109 1727204263.57503: stdout chunk (state=3): >>>/root <<< 44109 1727204263.57641: stderr chunk (state=3): >>>debug2: Received exit status from master 0 
<<< 44109 1727204263.57643: stdout chunk (state=3): >>><<< 44109 1727204263.57645: stderr chunk (state=3): >>><<< 44109 1727204263.57660: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44109 1727204263.57700: _low_level_execute_command(): starting 44109 1727204263.57704: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204263.5766487-46840-178246863236689 `" && echo ansible-tmp-1727204263.5766487-46840-178246863236689="` echo /root/.ansible/tmp/ansible-tmp-1727204263.5766487-46840-178246863236689 `" ) && sleep 0' 44109 1727204263.58212: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44109 1727204263.58269: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 44109 1727204263.58272: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 44109 1727204263.58367: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44109 1727204263.60515: stdout chunk (state=3): >>>ansible-tmp-1727204263.5766487-46840-178246863236689=/root/.ansible/tmp/ansible-tmp-1727204263.5766487-46840-178246863236689 <<< 44109 1727204263.60619: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44109 1727204263.60644: stderr chunk (state=3): >>><<< 44109 1727204263.60650: stdout chunk (state=3): >>><<< 44109 1727204263.60668: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204263.5766487-46840-178246863236689=/root/.ansible/tmp/ansible-tmp-1727204263.5766487-46840-178246863236689 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking 
match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44109 1727204263.60711: variable 'ansible_module_compression' from source: unknown 44109 1727204263.60746: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-44109pzfqangk/ansiballz_cache/ansible.modules.ping-ZIP_DEFLATED 44109 1727204263.60782: variable 'ansible_facts' from source: unknown 44109 1727204263.60834: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204263.5766487-46840-178246863236689/AnsiballZ_ping.py 44109 1727204263.60931: Sending initial data 44109 1727204263.60942: Sent initial data (153 bytes) 44109 1727204263.61374: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44109 1727204263.61381: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44109 1727204263.61384: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 44109 1727204263.61386: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 44109 1727204263.61388: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found <<< 44109 1727204263.61390: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44109 1727204263.61443: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 44109 1727204263.61535: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44109 1727204263.63290: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 44109 1727204263.63360: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 44109 1727204263.63438: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-44109pzfqangk/tmpqenty35c /root/.ansible/tmp/ansible-tmp-1727204263.5766487-46840-178246863236689/AnsiballZ_ping.py <<< 44109 1727204263.63444: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204263.5766487-46840-178246863236689/AnsiballZ_ping.py" <<< 44109 1727204263.63514: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-44109pzfqangk/tmpqenty35c" to remote "/root/.ansible/tmp/ansible-tmp-1727204263.5766487-46840-178246863236689/AnsiballZ_ping.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204263.5766487-46840-178246863236689/AnsiballZ_ping.py" <<< 44109 1727204263.64152: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44109 1727204263.64195: stderr chunk (state=3): >>><<< 44109 1727204263.64198: stdout chunk (state=3): >>><<< 44109 1727204263.64238: done transferring module to remote 44109 1727204263.64247: _low_level_execute_command(): starting 44109 1727204263.64251: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204263.5766487-46840-178246863236689/ /root/.ansible/tmp/ansible-tmp-1727204263.5766487-46840-178246863236689/AnsiballZ_ping.py && sleep 0' 44109 1727204263.64660: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44109 1727204263.64691: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44109 1727204263.64694: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found <<< 44109 1727204263.64697: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 44109 1727204263.64699: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 44109 1727204263.64701: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44109 1727204263.64752: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 44109 1727204263.64755: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 44109 1727204263.64853: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44109 1727204263.66788: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44109 1727204263.66811: stderr chunk (state=3): >>><<< 44109 1727204263.66815: stdout chunk (state=3): >>><<< 44109 1727204263.66831: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44109 1727204263.66834: _low_level_execute_command(): starting 44109 1727204263.66839: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204263.5766487-46840-178246863236689/AnsiballZ_ping.py && sleep 0' 44109 1727204263.67246: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 44109 1727204263.67282: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found <<< 44109 1727204263.67285: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44109 1727204263.67287: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration <<< 44109 1727204263.67290: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44109 1727204263.67291: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config 
<<< 44109 1727204263.67339: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK <<< 44109 1727204263.67345: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44109 1727204263.67428: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44109 1727204263.83670: stdout chunk (state=3): >>> {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} <<< 44109 1727204263.85387: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.47 closed. <<< 44109 1727204263.85391: stdout chunk (state=3): >>><<< 44109 1727204263.85393: stderr chunk (state=3): >>><<< 44109 1727204263.85396: _low_level_execute_command() done: rc=0, stdout= {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: 
mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.47 closed. 44109 1727204263.85402: done with _execute_module (ping, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ping', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204263.5766487-46840-178246863236689/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 44109 1727204263.85414: _low_level_execute_command(): starting 44109 1727204263.85418: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204263.5766487-46840-178246863236689/ > /dev/null 2>&1 && sleep 0' 44109 1727204263.86015: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44109 1727204263.86020: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44109 1727204263.86122: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44109 1727204263.86148: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44109 1727204263.86151: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found <<< 44109 1727204263.86167: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44109 1727204263.86172: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is 
address <<< 44109 1727204263.86187: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 44109 1727204263.86192: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44109 1727204263.86257: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found <<< 44109 1727204263.86261: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44109 1727204263.86325: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 44109 1727204263.86398: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44109 1727204263.88823: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44109 1727204263.88832: stdout chunk (state=3): >>><<< 44109 1727204263.88834: stderr chunk (state=3): >>><<< 44109 1727204263.88837: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 
'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44109 1727204263.88839: handler run complete 44109 1727204263.88841: attempt loop complete, returning result 44109 1727204263.88843: _execute() done 44109 1727204263.88845: dumping result to json 44109 1727204263.88846: done dumping result, returning 44109 1727204263.88848: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Re-test connectivity [028d2410-947f-ed67-a560-0000000000a6] 44109 1727204263.88850: sending task result for task 028d2410-947f-ed67-a560-0000000000a6 44109 1727204263.89277: done sending task result for task 028d2410-947f-ed67-a560-0000000000a6 44109 1727204263.89284: WORKER PROCESS EXITING ok: [managed-node1] => { "changed": false, "ping": "pong" } 44109 1727204263.89350: no more pending results, returning what we have 44109 1727204263.89354: results queue empty 44109 1727204263.89355: checking for any_errors_fatal 44109 1727204263.89363: done checking for any_errors_fatal 44109 1727204263.89364: checking for max_fail_percentage 44109 1727204263.89366: done checking for max_fail_percentage 44109 1727204263.89367: checking to see if all hosts have failed and the running result is not ok 44109 1727204263.89368: done checking to see if all hosts have failed 44109 1727204263.89369: getting the remaining hosts for this loop 44109 1727204263.89370: done getting the remaining hosts for this loop 44109 1727204263.89375: getting the next task for host managed-node1 44109 1727204263.89386: done getting next task for host managed-node1 44109 1727204263.89388: ^ task is: TASK: meta (role_complete) 44109 
1727204263.89390: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44109 1727204263.89400: getting variables 44109 1727204263.89402: in VariableManager get_vars() 44109 1727204263.89444: Calling all_inventory to load vars for managed-node1 44109 1727204263.89446: Calling groups_inventory to load vars for managed-node1 44109 1727204263.89449: Calling all_plugins_inventory to load vars for managed-node1 44109 1727204263.89459: Calling all_plugins_play to load vars for managed-node1 44109 1727204263.89462: Calling groups_plugins_inventory to load vars for managed-node1 44109 1727204263.89465: Calling groups_plugins_play to load vars for managed-node1 44109 1727204263.92736: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44109 1727204263.95950: done with get_vars() 44109 1727204263.96185: done getting variables 44109 1727204263.96266: done queuing things up, now waiting for results queue to drain 44109 1727204263.96269: results queue empty 44109 1727204263.96270: checking for any_errors_fatal 44109 1727204263.96273: done checking for any_errors_fatal 44109 1727204263.96273: checking for max_fail_percentage 44109 1727204263.96274: done checking for max_fail_percentage 44109 1727204263.96277: checking to see if all hosts have failed and the running result is not ok 44109 1727204263.96278: done checking to see if all hosts have failed 44109 1727204263.96279: getting the remaining hosts for this loop 44109 1727204263.96280: done getting the remaining hosts for this loop 44109 1727204263.96283: getting the next task for host managed-node1 44109 1727204263.96286: done getting next task for host managed-node1 44109 1727204263.96288: ^ task is: TASK: meta (flush_handlers) 
44109 1727204263.96289: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44109 1727204263.96292: getting variables 44109 1727204263.96294: in VariableManager get_vars() 44109 1727204263.96308: Calling all_inventory to load vars for managed-node1 44109 1727204263.96310: Calling groups_inventory to load vars for managed-node1 44109 1727204263.96314: Calling all_plugins_inventory to load vars for managed-node1 44109 1727204263.96320: Calling all_plugins_play to load vars for managed-node1 44109 1727204263.96322: Calling groups_plugins_inventory to load vars for managed-node1 44109 1727204263.96325: Calling groups_plugins_play to load vars for managed-node1 44109 1727204263.98747: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44109 1727204264.02179: done with get_vars() 44109 1727204264.02209: done getting variables 44109 1727204264.02264: in VariableManager get_vars() 44109 1727204264.02280: Calling all_inventory to load vars for managed-node1 44109 1727204264.02282: Calling groups_inventory to load vars for managed-node1 44109 1727204264.02284: Calling all_plugins_inventory to load vars for managed-node1 44109 1727204264.02289: Calling all_plugins_play to load vars for managed-node1 44109 1727204264.02291: Calling groups_plugins_inventory to load vars for managed-node1 44109 1727204264.02294: Calling groups_plugins_play to load vars for managed-node1 44109 1727204264.05004: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44109 1727204264.09974: done with get_vars() 44109 1727204264.10423: done queuing things up, now waiting for results queue to drain 44109 1727204264.10426: results queue empty 44109 
1727204264.10427: checking for any_errors_fatal 44109 1727204264.10428: done checking for any_errors_fatal 44109 1727204264.10429: checking for max_fail_percentage 44109 1727204264.10430: done checking for max_fail_percentage 44109 1727204264.10431: checking to see if all hosts have failed and the running result is not ok 44109 1727204264.10431: done checking to see if all hosts have failed 44109 1727204264.10432: getting the remaining hosts for this loop 44109 1727204264.10433: done getting the remaining hosts for this loop 44109 1727204264.10436: getting the next task for host managed-node1 44109 1727204264.10440: done getting next task for host managed-node1 44109 1727204264.10441: ^ task is: TASK: meta (flush_handlers) 44109 1727204264.10443: ^ state is: HOST STATE: block=5, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44109 1727204264.10446: getting variables 44109 1727204264.10447: in VariableManager get_vars() 44109 1727204264.10460: Calling all_inventory to load vars for managed-node1 44109 1727204264.10468: Calling groups_inventory to load vars for managed-node1 44109 1727204264.10470: Calling all_plugins_inventory to load vars for managed-node1 44109 1727204264.10478: Calling all_plugins_play to load vars for managed-node1 44109 1727204264.10481: Calling groups_plugins_inventory to load vars for managed-node1 44109 1727204264.10484: Calling groups_plugins_play to load vars for managed-node1 44109 1727204264.13648: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44109 1727204264.16978: done with get_vars() 44109 1727204264.17002: done getting variables 44109 1727204264.17058: in VariableManager get_vars() 44109 1727204264.17072: Calling all_inventory to load vars for managed-node1 44109 1727204264.17074: Calling groups_inventory to load vars for managed-node1 44109 1727204264.17078: Calling all_plugins_inventory to load vars for managed-node1 44109 1727204264.17083: Calling all_plugins_play to load vars for managed-node1 44109 1727204264.17085: Calling groups_plugins_inventory to load vars for managed-node1 44109 1727204264.17088: Calling groups_plugins_play to load vars for managed-node1 44109 1727204264.18259: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44109 1727204264.19899: done with get_vars() 44109 1727204264.19938: done queuing things up, now waiting for results queue to drain 44109 1727204264.19940: results queue empty 44109 1727204264.19941: checking for any_errors_fatal 44109 1727204264.19943: done checking for any_errors_fatal 44109 1727204264.19943: checking for max_fail_percentage 44109 1727204264.19944: done checking for max_fail_percentage 44109 1727204264.19945: checking to see if all hosts have failed and the running result is not 
ok 44109 1727204264.19946: done checking to see if all hosts have failed 44109 1727204264.19947: getting the remaining hosts for this loop 44109 1727204264.19948: done getting the remaining hosts for this loop 44109 1727204264.19951: getting the next task for host managed-node1 44109 1727204264.19954: done getting next task for host managed-node1 44109 1727204264.19955: ^ task is: None 44109 1727204264.19956: ^ state is: HOST STATE: block=6, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44109 1727204264.19957: done queuing things up, now waiting for results queue to drain 44109 1727204264.19958: results queue empty 44109 1727204264.19959: checking for any_errors_fatal 44109 1727204264.19960: done checking for any_errors_fatal 44109 1727204264.19960: checking for max_fail_percentage 44109 1727204264.19961: done checking for max_fail_percentage 44109 1727204264.19962: checking to see if all hosts have failed and the running result is not ok 44109 1727204264.19962: done checking to see if all hosts have failed 44109 1727204264.19964: getting the next task for host managed-node1 44109 1727204264.19966: done getting next task for host managed-node1 44109 1727204264.19966: ^ task is: None 44109 1727204264.19968: ^ state is: HOST STATE: block=6, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44109 1727204264.20019: in VariableManager get_vars() 44109 1727204264.20036: done with get_vars() 44109 1727204264.20042: in VariableManager get_vars() 44109 1727204264.20051: done with get_vars() 44109 1727204264.20054: variable 'omit' from source: magic vars 44109 1727204264.20088: in VariableManager get_vars() 44109 1727204264.20097: done with get_vars() 44109 1727204264.20121: variable 'omit' from source: magic vars PLAY [Assert device and profile are absent] ************************************ 44109 1727204264.20303: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py (found_in_cache=True, class_only=False) 44109 1727204264.20331: getting the remaining hosts for this loop 44109 1727204264.20332: done getting the remaining hosts for this loop 44109 1727204264.20335: getting the next task for host managed-node1 44109 1727204264.20337: done getting next task for host managed-node1 44109 1727204264.20339: ^ task is: TASK: Gathering Facts 44109 1727204264.20341: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44109 1727204264.20343: getting variables 44109 1727204264.20344: in VariableManager get_vars() 44109 1727204264.20354: Calling all_inventory to load vars for managed-node1 44109 1727204264.20356: Calling groups_inventory to load vars for managed-node1 44109 1727204264.20358: Calling all_plugins_inventory to load vars for managed-node1 44109 1727204264.20365: Calling all_plugins_play to load vars for managed-node1 44109 1727204264.20367: Calling groups_plugins_inventory to load vars for managed-node1 44109 1727204264.20370: Calling groups_plugins_play to load vars for managed-node1 44109 1727204264.21671: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44109 1727204264.24061: done with get_vars() 44109 1727204264.24096: done getting variables 44109 1727204264.24149: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Gathering Facts] ********************************************************* task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_routing_rules.yml:227 Tuesday 24 September 2024 14:57:44 -0400 (0:00:00.719) 0:00:41.038 ***** 44109 1727204264.24383: entering _queue_task() for managed-node1/gather_facts 44109 1727204264.24853: worker is 1 (out of 1 available) 44109 1727204264.24864: exiting _queue_task() for managed-node1/gather_facts 44109 1727204264.25119: done queuing things up, now waiting for results queue to drain 44109 1727204264.25121: waiting for pending results... 
44109 1727204264.25695: running TaskExecutor() for managed-node1/TASK: Gathering Facts 44109 1727204264.25700: in run() - task 028d2410-947f-ed67-a560-00000000066a 44109 1727204264.25704: variable 'ansible_search_path' from source: unknown 44109 1727204264.26114: calling self._execute() 44109 1727204264.26586: variable 'ansible_host' from source: host vars for 'managed-node1' 44109 1727204264.26590: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44109 1727204264.26593: variable 'omit' from source: magic vars 44109 1727204264.27232: variable 'ansible_distribution_major_version' from source: facts 44109 1727204264.27252: Evaluated conditional (ansible_distribution_major_version != '6'): True 44109 1727204264.27265: variable 'omit' from source: magic vars 44109 1727204264.27305: variable 'omit' from source: magic vars 44109 1727204264.27352: variable 'omit' from source: magic vars 44109 1727204264.27408: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 44109 1727204264.27451: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 44109 1727204264.27482: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 44109 1727204264.27508: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44109 1727204264.27528: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44109 1727204264.27564: variable 'inventory_hostname' from source: host vars for 'managed-node1' 44109 1727204264.27572: variable 'ansible_host' from source: host vars for 'managed-node1' 44109 1727204264.27585: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44109 1727204264.27698: Set connection var ansible_connection to ssh 44109 1727204264.27718: Set 
connection var ansible_timeout to 10 44109 1727204264.27728: Set connection var ansible_module_compression to ZIP_DEFLATED 44109 1727204264.27739: Set connection var ansible_pipelining to False 44109 1727204264.27747: Set connection var ansible_shell_executable to /bin/sh 44109 1727204264.27755: Set connection var ansible_shell_type to sh 44109 1727204264.27783: variable 'ansible_shell_executable' from source: unknown 44109 1727204264.27790: variable 'ansible_connection' from source: unknown 44109 1727204264.27801: variable 'ansible_module_compression' from source: unknown 44109 1727204264.27808: variable 'ansible_shell_type' from source: unknown 44109 1727204264.27818: variable 'ansible_shell_executable' from source: unknown 44109 1727204264.27909: variable 'ansible_host' from source: host vars for 'managed-node1' 44109 1727204264.27914: variable 'ansible_pipelining' from source: unknown 44109 1727204264.27917: variable 'ansible_timeout' from source: unknown 44109 1727204264.27919: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44109 1727204264.28062: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 44109 1727204264.28081: variable 'omit' from source: magic vars 44109 1727204264.28091: starting attempt loop 44109 1727204264.28097: running the handler 44109 1727204264.28120: variable 'ansible_facts' from source: unknown 44109 1727204264.28154: _low_level_execute_command(): starting 44109 1727204264.28166: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 44109 1727204264.29008: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44109 1727204264.29028: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config 
<<< 44109 1727204264.29126: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44109 1727204264.29159: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 44109 1727204264.29179: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44109 1727204264.29450: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44109 1727204264.29579: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44109 1727204264.31336: stdout chunk (state=3): >>>/root <<< 44109 1727204264.31427: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44109 1727204264.31478: stderr chunk (state=3): >>><<< 44109 1727204264.31481: stdout chunk (state=3): >>><<< 44109 1727204264.31503: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44109 1727204264.31605: _low_level_execute_command(): starting 44109 1727204264.31609: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204264.3151467-46875-12091526719391 `" && echo ansible-tmp-1727204264.3151467-46875-12091526719391="` echo /root/.ansible/tmp/ansible-tmp-1727204264.3151467-46875-12091526719391 `" ) && sleep 0' 44109 1727204264.32140: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44109 1727204264.32153: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44109 1727204264.32167: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44109 1727204264.32187: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44109 1727204264.32284: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass 
debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK <<< 44109 1727204264.32317: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44109 1727204264.32430: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44109 1727204264.34528: stdout chunk (state=3): >>>ansible-tmp-1727204264.3151467-46875-12091526719391=/root/.ansible/tmp/ansible-tmp-1727204264.3151467-46875-12091526719391 <<< 44109 1727204264.34708: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44109 1727204264.34712: stdout chunk (state=3): >>><<< 44109 1727204264.34714: stderr chunk (state=3): >>><<< 44109 1727204264.34882: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204264.3151467-46875-12091526719391=/root/.ansible/tmp/ansible-tmp-1727204264.3151467-46875-12091526719391 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44109 1727204264.34886: variable 'ansible_module_compression' from source: unknown 44109 1727204264.34888: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-44109pzfqangk/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 44109 1727204264.34899: variable 'ansible_facts' from source: unknown 44109 1727204264.35295: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204264.3151467-46875-12091526719391/AnsiballZ_setup.py 44109 1727204264.35955: Sending initial data 44109 1727204264.35958: Sent initial data (153 bytes) 44109 1727204264.36607: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44109 1727204264.36698: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 
10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44109 1727204264.36741: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK <<< 44109 1727204264.36796: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44109 1727204264.36936: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44109 1727204264.38650: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 44109 1727204264.38661: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 <<< 44109 1727204264.38670: stderr chunk (state=3): >>>debug2: Server supports extension "statvfs@openssh.com" revision 2 <<< 44109 1727204264.38682: stderr chunk (state=3): >>>debug2: Server supports extension "fstatvfs@openssh.com" revision 2 <<< 44109 1727204264.38694: stderr chunk (state=3): >>>debug2: Server supports extension "hardlink@openssh.com" revision 1 <<< 44109 1727204264.38712: stderr chunk (state=3): >>>debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 44109 1727204264.39008: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 44109 1727204264.39083: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-44109pzfqangk/tmp7ibues5u /root/.ansible/tmp/ansible-tmp-1727204264.3151467-46875-12091526719391/AnsiballZ_setup.py <<< 44109 1727204264.39087: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204264.3151467-46875-12091526719391/AnsiballZ_setup.py" <<< 44109 1727204264.39180: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-44109pzfqangk/tmp7ibues5u" to remote "/root/.ansible/tmp/ansible-tmp-1727204264.3151467-46875-12091526719391/AnsiballZ_setup.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204264.3151467-46875-12091526719391/AnsiballZ_setup.py" <<< 44109 1727204264.41901: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44109 1727204264.41983: stderr chunk (state=3): >>><<< 44109 1727204264.41991: stdout chunk (state=3): >>><<< 44109 1727204264.42017: done transferring module to remote 44109 1727204264.42031: _low_level_execute_command(): starting 44109 1727204264.42039: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204264.3151467-46875-12091526719391/ /root/.ansible/tmp/ansible-tmp-1727204264.3151467-46875-12091526719391/AnsiballZ_setup.py && sleep 0' 44109 1727204264.42715: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44109 1727204264.42752: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44109 1727204264.42764: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44109 1727204264.42869: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK <<< 44109 1727204264.42899: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44109 1727204264.43020: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44109 1727204264.45031: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44109 1727204264.45035: stdout chunk (state=3): >>><<< 44109 1727204264.45042: stderr chunk (state=3): >>><<< 44109 1727204264.45181: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found 
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44109 1727204264.45185: _low_level_execute_command(): starting 44109 1727204264.45187: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204264.3151467-46875-12091526719391/AnsiballZ_setup.py && sleep 0' 44109 1727204264.45710: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44109 1727204264.45719: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44109 1727204264.45732: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44109 1727204264.45753: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44109 1727204264.45765: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 <<< 44109 1727204264.45788: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 44109 1727204264.45855: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44109 1727204264.45877: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at 
'/root/.ansible/cp/a0f5415566' <<< 44109 1727204264.45895: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44109 1727204264.45915: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44109 1727204264.46021: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44109 1727204265.12674: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_system": "Linux", "ansible_kernel": "6.11.0-25.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Mon Sep 16 20:35:26 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "managed-node1", "ansible_hostname": "managed-node1", "ansible_nodename": "managed-node1", "ansible_domain": "", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": 
"x86_64", "ansible_machine_id": "ec277914f6c5b9c03bd977e30033112b", "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Tuesday", "weekday_number": "2", "weeknumber": "39", "day": "24", "hour": "14", "minute": "57", "second": "44", "epoch": "1727204264", "epoch_int": "1727204264", "date": "2024-09-24", "time": "14:57:44", "iso8601_micro": "2024-09-24T18:57:44.748757Z", "iso8601": "2024-09-24T18:57:44Z", "iso8601_basic": "20240924T145744748757", "iso8601_basic_short": "20240924T145744", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-25.el10.x86_64", "root": "UUID=973ca870-ed1b-4e56-a8b4-735608119a28", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-25.el10.x86_64", "root": "UUID=973ca870-ed1b-4e56-a8b4-735608119a28", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCWk3MCbpUJBEaXgG200pw3DBo34ukitT+wfYcFTuNXyUUPaXQe7v940X0cf5U78BgS3AAiRxfHirMb+4r43rwxBe5tl4Vq2WM+kz3JnOtxK8ZXTmwS9PbltzX5fg5CVds9Bu6KIwABJMlgT9CTHVjFlTBc8wpoeJvx8wVgMwQlnF+PFN/lzD0gn0sF11bqe2QCvxmm9r7Lum/QdIVGgOiZMMbULJZb0Iih24Tn74Ho6n9zLSFZ5FiFifjm7M6k1hVtfcAQi3GfPdUxkN0Z66f3KaW4hptFlHxttjLliuAyfNF4UrXIab7y/nDix1ll4x4lLHsVRpcwtmVVe/Z+2/pcmRv4fjw8YzWY1pLV5u1BUUBthICgfv0pXEuLP9UD/krnjy0Ho9gAbtjdoRWtn7gvjRds+WEwk83rZS3UnAc3pl2DmHzbp4IfRC1zp8eJPJoVwcSTEr61su59tkNntjdKAcLWeas1p9XBaTIRx7aqRp7Vdet96xbSnDnBCriXgAU=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBJzECuW8BnvEbYnQxnxmebvg9TYk9r0OUd9aUg8FFv4MvjSzW8tCfnW556hw9n4PI2hShtAWz7XExrMZPlTQXRo=", 
"ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAINUE+mKAHEgVl/vTdVMwRCu3lDCTOYBl1RcikvxylCeg", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_local": {}, "ansible_fibre_channel_wwn": [], "ansible_apparmor": {"status": "disabled"}, "ansible_lsb": {}, "ansible_pkg_mgr": "dnf", "ansible_fips": false, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:11e86335-d786-4518-8abc-c9417b351256", "ansible_is_chroot": false, "ansible_iscsi_iqn": "", "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2909, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 622, "free": 2909}, "nocache": {"free": 3269, "used": 262}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec277914-f6c5-b9c0-3bd9-77e30033112b", "ansible_product_uuid": "ec277914-f6c5-b9c0-3bd9-77e30033112b", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, 
"removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["973ca870-ed1b-4e56-a8b4-735608119a28"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "973ca870-ed1b-4e56-a8b4-735608119a28", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["973ca870-ed1b-4e56-a8b4-735608119a28"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 856, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261781151744, "block_size": 4096, "block_total": 65519099, "block_available": 63911414, "block_used": 1607685, "inode_total": 131070960, "inode_available": 131027258, "inode_used": 43702, "uuid": "973ca870-ed1b-4e56-a8b4-735608119a28"}], "ansible_interfaces": ["lo", "eth0"], "ansible_eth0": {"device": "eth0", "macaddress": "0a:ff:dd:89:9b:e5", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.14.47", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22"}, "ipv6": [{"address": "fe80::8ff:ddff:fe89:9be5", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", 
"scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", 
"hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", 
"rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.12.1", "interface": "eth0", "address": "10.31.14.47", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22", "macaddress": "0a:ff:dd:89:9b:e5", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.14.47"], "ansible_all_ipv6_addresses": ["fe80::8ff:ddff:fe89:9be5"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.14.47", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::8ff:ddff:fe89:9be5"]}, "ansible_env": {"SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.45.138 50362 10.31.14.47 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.45.138 50362 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", 
"SSH_TTY": "/dev/pts/0"}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_loadavg": {"1m": 0.595703125, "5m": 0.54052734375, "15m": 0.31884765625}, "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 44109 1727204265.15086: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.47 closed. <<< 44109 1727204265.15091: stderr chunk (state=3): >>><<< 44109 1727204265.15093: stdout chunk (state=3): >>><<< 44109 1727204265.15097: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_system": 
"Linux", "ansible_kernel": "6.11.0-25.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Mon Sep 16 20:35:26 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "managed-node1", "ansible_hostname": "managed-node1", "ansible_nodename": "managed-node1", "ansible_domain": "", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec277914f6c5b9c03bd977e30033112b", "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Tuesday", "weekday_number": "2", "weeknumber": "39", "day": "24", "hour": "14", "minute": "57", "second": "44", "epoch": "1727204264", "epoch_int": "1727204264", "date": "2024-09-24", "time": "14:57:44", "iso8601_micro": "2024-09-24T18:57:44.748757Z", "iso8601": "2024-09-24T18:57:44Z", "iso8601_basic": "20240924T145744748757", "iso8601_basic_short": "20240924T145744", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-25.el10.x86_64", "root": "UUID=973ca870-ed1b-4e56-a8b4-735608119a28", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-25.el10.x86_64", "root": "UUID=973ca870-ed1b-4e56-a8b4-735608119a28", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_ssh_host_key_rsa_public": 
"AAAAB3NzaC1yc2EAAAADAQABAAABgQCWk3MCbpUJBEaXgG200pw3DBo34ukitT+wfYcFTuNXyUUPaXQe7v940X0cf5U78BgS3AAiRxfHirMb+4r43rwxBe5tl4Vq2WM+kz3JnOtxK8ZXTmwS9PbltzX5fg5CVds9Bu6KIwABJMlgT9CTHVjFlTBc8wpoeJvx8wVgMwQlnF+PFN/lzD0gn0sF11bqe2QCvxmm9r7Lum/QdIVGgOiZMMbULJZb0Iih24Tn74Ho6n9zLSFZ5FiFifjm7M6k1hVtfcAQi3GfPdUxkN0Z66f3KaW4hptFlHxttjLliuAyfNF4UrXIab7y/nDix1ll4x4lLHsVRpcwtmVVe/Z+2/pcmRv4fjw8YzWY1pLV5u1BUUBthICgfv0pXEuLP9UD/krnjy0Ho9gAbtjdoRWtn7gvjRds+WEwk83rZS3UnAc3pl2DmHzbp4IfRC1zp8eJPJoVwcSTEr61su59tkNntjdKAcLWeas1p9XBaTIRx7aqRp7Vdet96xbSnDnBCriXgAU=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBJzECuW8BnvEbYnQxnxmebvg9TYk9r0OUd9aUg8FFv4MvjSzW8tCfnW556hw9n4PI2hShtAWz7XExrMZPlTQXRo=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAINUE+mKAHEgVl/vTdVMwRCu3lDCTOYBl1RcikvxylCeg", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_local": {}, "ansible_fibre_channel_wwn": [], "ansible_apparmor": {"status": "disabled"}, "ansible_lsb": {}, "ansible_pkg_mgr": "dnf", "ansible_fips": false, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:11e86335-d786-4518-8abc-c9417b351256", "ansible_is_chroot": false, "ansible_iscsi_iqn": "", "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2909, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 622, "free": 2909}, "nocache": {"free": 3269, "used": 262}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", 
"ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec277914-f6c5-b9c0-3bd9-77e30033112b", "ansible_product_uuid": "ec277914-f6c5-b9c0-3bd9-77e30033112b", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["973ca870-ed1b-4e56-a8b4-735608119a28"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "973ca870-ed1b-4e56-a8b4-735608119a28", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["973ca870-ed1b-4e56-a8b4-735608119a28"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 856, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261781151744, "block_size": 4096, "block_total": 65519099, "block_available": 63911414, "block_used": 1607685, "inode_total": 131070960, 
"inode_available": 131027258, "inode_used": 43702, "uuid": "973ca870-ed1b-4e56-a8b4-735608119a28"}], "ansible_interfaces": ["lo", "eth0"], "ansible_eth0": {"device": "eth0", "macaddress": "0a:ff:dd:89:9b:e5", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.14.47", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22"}, "ipv6": [{"address": "fe80::8ff:ddff:fe89:9be5", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", 
"fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": 
"on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.12.1", "interface": "eth0", "address": "10.31.14.47", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22", "macaddress": "0a:ff:dd:89:9b:e5", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.14.47"], "ansible_all_ipv6_addresses": ["fe80::8ff:ddff:fe89:9be5"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.14.47", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::8ff:ddff:fe89:9be5"]}, "ansible_env": {"SHELL": 
"/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.45.138 50362 10.31.14.47 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.45.138 50362 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_loadavg": {"1m": 0.595703125, "5m": 0.54052734375, "15m": 0.31884765625}, "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking 
match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.47 closed. 44109 1727204265.15885: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204264.3151467-46875-12091526719391/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 44109 1727204265.15904: _low_level_execute_command(): starting 44109 1727204265.15921: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204264.3151467-46875-12091526719391/ > /dev/null 2>&1 && sleep 0' 44109 1727204265.17432: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 44109 1727204265.17510: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44109 1727204265.19567: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44109 1727204265.19593: stdout chunk (state=3): >>><<< 44109 1727204265.19607: stderr chunk (state=3): >>><<< 44109 1727204265.19632: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' 
debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44109 1727204265.19646: handler run complete 44109 1727204265.19788: variable 'ansible_facts' from source: unknown 44109 1727204265.19923: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44109 1727204265.20269: variable 'ansible_facts' from source: unknown 44109 1727204265.20369: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44109 1727204265.20515: attempt loop complete, returning result 44109 1727204265.20526: _execute() done 44109 1727204265.20533: dumping result to json 44109 1727204265.20577: done dumping result, returning 44109 1727204265.20591: done running TaskExecutor() for managed-node1/TASK: Gathering Facts [028d2410-947f-ed67-a560-00000000066a] 44109 1727204265.20600: sending task result for task 028d2410-947f-ed67-a560-00000000066a 44109 1727204265.21248: done sending task result for task 028d2410-947f-ed67-a560-00000000066a 44109 1727204265.21251: WORKER PROCESS EXITING ok: [managed-node1] 44109 1727204265.21639: no more pending results, returning what we have 44109 1727204265.21642: results queue empty 44109 1727204265.21643: checking for any_errors_fatal 44109 1727204265.21644: done checking for any_errors_fatal 44109 1727204265.21645: checking for max_fail_percentage 44109 1727204265.21647: done checking for max_fail_percentage 44109 1727204265.21648: checking to see if all hosts have failed and the running result is not ok 44109 1727204265.21648: done checking to see if all hosts have failed 44109 1727204265.21649: getting the remaining hosts for this loop 44109 1727204265.21650: done getting the remaining hosts for this loop 44109 1727204265.21654: getting the next task for host managed-node1 44109 1727204265.21659: done getting next task for host 
managed-node1 44109 1727204265.21661: ^ task is: TASK: meta (flush_handlers) 44109 1727204265.21662: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44109 1727204265.21666: getting variables 44109 1727204265.21668: in VariableManager get_vars() 44109 1727204265.21770: Calling all_inventory to load vars for managed-node1 44109 1727204265.21774: Calling groups_inventory to load vars for managed-node1 44109 1727204265.21780: Calling all_plugins_inventory to load vars for managed-node1 44109 1727204265.21797: Calling all_plugins_play to load vars for managed-node1 44109 1727204265.21800: Calling groups_plugins_inventory to load vars for managed-node1 44109 1727204265.21803: Calling groups_plugins_play to load vars for managed-node1 44109 1727204265.22753: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44109 1727204265.23737: done with get_vars() 44109 1727204265.23756: done getting variables 44109 1727204265.23824: in VariableManager get_vars() 44109 1727204265.23833: Calling all_inventory to load vars for managed-node1 44109 1727204265.23836: Calling groups_inventory to load vars for managed-node1 44109 1727204265.23839: Calling all_plugins_inventory to load vars for managed-node1 44109 1727204265.23844: Calling all_plugins_play to load vars for managed-node1 44109 1727204265.23846: Calling groups_plugins_inventory to load vars for managed-node1 44109 1727204265.23849: Calling groups_plugins_play to load vars for managed-node1 44109 1727204265.24980: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44109 1727204265.25859: done with get_vars() 44109 1727204265.25882: done queuing things up, now waiting for 
results queue to drain 44109 1727204265.25884: results queue empty 44109 1727204265.25884: checking for any_errors_fatal 44109 1727204265.25887: done checking for any_errors_fatal 44109 1727204265.25887: checking for max_fail_percentage 44109 1727204265.25888: done checking for max_fail_percentage 44109 1727204265.25889: checking to see if all hosts have failed and the running result is not ok 44109 1727204265.25889: done checking to see if all hosts have failed 44109 1727204265.25890: getting the remaining hosts for this loop 44109 1727204265.25890: done getting the remaining hosts for this loop 44109 1727204265.25896: getting the next task for host managed-node1 44109 1727204265.25899: done getting next task for host managed-node1 44109 1727204265.25901: ^ task is: TASK: Include the task 'assert_profile_absent.yml' 44109 1727204265.25902: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44109 1727204265.25904: getting variables 44109 1727204265.25904: in VariableManager get_vars() 44109 1727204265.25914: Calling all_inventory to load vars for managed-node1 44109 1727204265.25916: Calling groups_inventory to load vars for managed-node1 44109 1727204265.25918: Calling all_plugins_inventory to load vars for managed-node1 44109 1727204265.25923: Calling all_plugins_play to load vars for managed-node1 44109 1727204265.25924: Calling groups_plugins_inventory to load vars for managed-node1 44109 1727204265.25926: Calling groups_plugins_play to load vars for managed-node1 44109 1727204265.26619: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44109 1727204265.28006: done with get_vars() 44109 1727204265.28029: done getting variables TASK [Include the task 'assert_profile_absent.yml'] **************************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_routing_rules.yml:230 Tuesday 24 September 2024 14:57:45 -0400 (0:00:01.039) 0:00:42.077 ***** 44109 1727204265.28107: entering _queue_task() for managed-node1/include_tasks 44109 1727204265.28422: worker is 1 (out of 1 available) 44109 1727204265.28434: exiting _queue_task() for managed-node1/include_tasks 44109 1727204265.28445: done queuing things up, now waiting for results queue to drain 44109 1727204265.28446: waiting for pending results... 
44109 1727204265.28636: running TaskExecutor() for managed-node1/TASK: Include the task 'assert_profile_absent.yml' 44109 1727204265.28706: in run() - task 028d2410-947f-ed67-a560-0000000000a9 44109 1727204265.28723: variable 'ansible_search_path' from source: unknown 44109 1727204265.28750: calling self._execute() 44109 1727204265.28833: variable 'ansible_host' from source: host vars for 'managed-node1' 44109 1727204265.28837: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44109 1727204265.28847: variable 'omit' from source: magic vars 44109 1727204265.29126: variable 'ansible_distribution_major_version' from source: facts 44109 1727204265.29135: Evaluated conditional (ansible_distribution_major_version != '6'): True 44109 1727204265.29141: _execute() done 44109 1727204265.29145: dumping result to json 44109 1727204265.29149: done dumping result, returning 44109 1727204265.29154: done running TaskExecutor() for managed-node1/TASK: Include the task 'assert_profile_absent.yml' [028d2410-947f-ed67-a560-0000000000a9] 44109 1727204265.29164: sending task result for task 028d2410-947f-ed67-a560-0000000000a9 44109 1727204265.29258: done sending task result for task 028d2410-947f-ed67-a560-0000000000a9 44109 1727204265.29260: WORKER PROCESS EXITING 44109 1727204265.29287: no more pending results, returning what we have 44109 1727204265.29292: in VariableManager get_vars() 44109 1727204265.29325: Calling all_inventory to load vars for managed-node1 44109 1727204265.29327: Calling groups_inventory to load vars for managed-node1 44109 1727204265.29331: Calling all_plugins_inventory to load vars for managed-node1 44109 1727204265.29342: Calling all_plugins_play to load vars for managed-node1 44109 1727204265.29345: Calling groups_plugins_inventory to load vars for managed-node1 44109 1727204265.29348: Calling groups_plugins_play to load vars for managed-node1 44109 1727204265.30489: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44109 1727204265.31877: done with get_vars() 44109 1727204265.31893: variable 'ansible_search_path' from source: unknown 44109 1727204265.31907: we have included files to process 44109 1727204265.31908: generating all_blocks data 44109 1727204265.31909: done generating all_blocks data 44109 1727204265.31910: processing included file: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml 44109 1727204265.31910: loading included file: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml 44109 1727204265.31912: Loading data from /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml 44109 1727204265.32029: in VariableManager get_vars() 44109 1727204265.32041: done with get_vars() 44109 1727204265.32116: done processing included file 44109 1727204265.32118: iterating over new_blocks loaded from include file 44109 1727204265.32119: in VariableManager get_vars() 44109 1727204265.32129: done with get_vars() 44109 1727204265.32130: filtering new block on tags 44109 1727204265.32141: done filtering new block on tags 44109 1727204265.32143: done iterating over new_blocks loaded from include file included: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml for managed-node1 44109 1727204265.32147: extending task lists for all hosts with included blocks 44109 1727204265.32187: done extending task lists 44109 1727204265.32188: done processing included files 44109 1727204265.32189: results queue empty 44109 1727204265.32189: checking for any_errors_fatal 44109 1727204265.32190: done checking for any_errors_fatal 44109 1727204265.32190: checking for max_fail_percentage 44109 1727204265.32191: done 
checking for max_fail_percentage 44109 1727204265.32192: checking to see if all hosts have failed and the running result is not ok 44109 1727204265.32192: done checking to see if all hosts have failed 44109 1727204265.32193: getting the remaining hosts for this loop 44109 1727204265.32193: done getting the remaining hosts for this loop 44109 1727204265.32195: getting the next task for host managed-node1 44109 1727204265.32197: done getting next task for host managed-node1 44109 1727204265.32199: ^ task is: TASK: Include the task 'get_profile_stat.yml' 44109 1727204265.32200: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44109 1727204265.32202: getting variables 44109 1727204265.32203: in VariableManager get_vars() 44109 1727204265.32208: Calling all_inventory to load vars for managed-node1 44109 1727204265.32210: Calling groups_inventory to load vars for managed-node1 44109 1727204265.32211: Calling all_plugins_inventory to load vars for managed-node1 44109 1727204265.32216: Calling all_plugins_play to load vars for managed-node1 44109 1727204265.32217: Calling groups_plugins_inventory to load vars for managed-node1 44109 1727204265.32219: Calling groups_plugins_play to load vars for managed-node1 44109 1727204265.32918: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44109 1727204265.34097: done with get_vars() 44109 1727204265.34129: done getting variables TASK [Include the task 'get_profile_stat.yml'] ********************************* task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml:3 Tuesday 24 September 2024 14:57:45 -0400 (0:00:00.060) 0:00:42.138 ***** 44109 1727204265.34217: entering _queue_task() for managed-node1/include_tasks 44109 1727204265.34519: worker is 1 (out of 1 available) 44109 1727204265.34531: exiting _queue_task() for managed-node1/include_tasks 44109 1727204265.34547: done queuing things up, now waiting for results queue to drain 44109 1727204265.34549: waiting for pending results... 
44109 1727204265.34744: running TaskExecutor() for managed-node1/TASK: Include the task 'get_profile_stat.yml' 44109 1727204265.34842: in run() - task 028d2410-947f-ed67-a560-00000000067b 44109 1727204265.34854: variable 'ansible_search_path' from source: unknown 44109 1727204265.34859: variable 'ansible_search_path' from source: unknown 44109 1727204265.34893: calling self._execute() 44109 1727204265.34963: variable 'ansible_host' from source: host vars for 'managed-node1' 44109 1727204265.34966: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44109 1727204265.34977: variable 'omit' from source: magic vars 44109 1727204265.35262: variable 'ansible_distribution_major_version' from source: facts 44109 1727204265.35272: Evaluated conditional (ansible_distribution_major_version != '6'): True 44109 1727204265.35279: _execute() done 44109 1727204265.35281: dumping result to json 44109 1727204265.35285: done dumping result, returning 44109 1727204265.35290: done running TaskExecutor() for managed-node1/TASK: Include the task 'get_profile_stat.yml' [028d2410-947f-ed67-a560-00000000067b] 44109 1727204265.35295: sending task result for task 028d2410-947f-ed67-a560-00000000067b 44109 1727204265.35385: done sending task result for task 028d2410-947f-ed67-a560-00000000067b 44109 1727204265.35388: WORKER PROCESS EXITING 44109 1727204265.35441: no more pending results, returning what we have 44109 1727204265.35446: in VariableManager get_vars() 44109 1727204265.35481: Calling all_inventory to load vars for managed-node1 44109 1727204265.35484: Calling groups_inventory to load vars for managed-node1 44109 1727204265.35488: Calling all_plugins_inventory to load vars for managed-node1 44109 1727204265.35499: Calling all_plugins_play to load vars for managed-node1 44109 1727204265.35502: Calling groups_plugins_inventory to load vars for managed-node1 44109 1727204265.35506: Calling groups_plugins_play to load vars for managed-node1 44109 
1727204265.36336: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44109 1727204265.37907: done with get_vars() 44109 1727204265.37923: variable 'ansible_search_path' from source: unknown 44109 1727204265.37924: variable 'ansible_search_path' from source: unknown 44109 1727204265.37951: we have included files to process 44109 1727204265.37952: generating all_blocks data 44109 1727204265.37953: done generating all_blocks data 44109 1727204265.37954: processing included file: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 44109 1727204265.37955: loading included file: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 44109 1727204265.37957: Loading data from /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 44109 1727204265.38667: done processing included file 44109 1727204265.38668: iterating over new_blocks loaded from include file 44109 1727204265.38669: in VariableManager get_vars() 44109 1727204265.38681: done with get_vars() 44109 1727204265.38682: filtering new block on tags 44109 1727204265.38696: done filtering new block on tags 44109 1727204265.38698: in VariableManager get_vars() 44109 1727204265.38707: done with get_vars() 44109 1727204265.38707: filtering new block on tags 44109 1727204265.38722: done filtering new block on tags 44109 1727204265.38723: done iterating over new_blocks loaded from include file included: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml for managed-node1 44109 1727204265.38727: extending task lists for all hosts with included blocks 44109 1727204265.38789: done extending task lists 44109 1727204265.38790: done processing included files 44109 1727204265.38790: results queue empty 44109 
1727204265.38791: checking for any_errors_fatal 44109 1727204265.38793: done checking for any_errors_fatal 44109 1727204265.38793: checking for max_fail_percentage 44109 1727204265.38794: done checking for max_fail_percentage 44109 1727204265.38794: checking to see if all hosts have failed and the running result is not ok 44109 1727204265.38795: done checking to see if all hosts have failed 44109 1727204265.38795: getting the remaining hosts for this loop 44109 1727204265.38796: done getting the remaining hosts for this loop 44109 1727204265.38798: getting the next task for host managed-node1 44109 1727204265.38801: done getting next task for host managed-node1 44109 1727204265.38802: ^ task is: TASK: Initialize NM profile exist and ansible_managed comment flag 44109 1727204265.38804: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44109 1727204265.38805: getting variables 44109 1727204265.38806: in VariableManager get_vars() 44109 1727204265.38848: Calling all_inventory to load vars for managed-node1 44109 1727204265.38850: Calling groups_inventory to load vars for managed-node1 44109 1727204265.38852: Calling all_plugins_inventory to load vars for managed-node1 44109 1727204265.38857: Calling all_plugins_play to load vars for managed-node1 44109 1727204265.38859: Calling groups_plugins_inventory to load vars for managed-node1 44109 1727204265.38862: Calling groups_plugins_play to load vars for managed-node1 44109 1727204265.39649: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44109 1727204265.41169: done with get_vars() 44109 1727204265.41197: done getting variables 44109 1727204265.41242: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Initialize NM profile exist and ansible_managed comment flag] ************ task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:3 Tuesday 24 September 2024 14:57:45 -0400 (0:00:00.070) 0:00:42.209 ***** 44109 1727204265.41271: entering _queue_task() for managed-node1/set_fact 44109 1727204265.41629: worker is 1 (out of 1 available) 44109 1727204265.41641: exiting _queue_task() for managed-node1/set_fact 44109 1727204265.41653: done queuing things up, now waiting for results queue to drain 44109 1727204265.41654: waiting for pending results... 
44109 1727204265.41984: running TaskExecutor() for managed-node1/TASK: Initialize NM profile exist and ansible_managed comment flag 44109 1727204265.42060: in run() - task 028d2410-947f-ed67-a560-00000000068a 44109 1727204265.42089: variable 'ansible_search_path' from source: unknown 44109 1727204265.42097: variable 'ansible_search_path' from source: unknown 44109 1727204265.42139: calling self._execute() 44109 1727204265.42234: variable 'ansible_host' from source: host vars for 'managed-node1' 44109 1727204265.42245: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44109 1727204265.42257: variable 'omit' from source: magic vars 44109 1727204265.42632: variable 'ansible_distribution_major_version' from source: facts 44109 1727204265.42881: Evaluated conditional (ansible_distribution_major_version != '6'): True 44109 1727204265.42884: variable 'omit' from source: magic vars 44109 1727204265.42886: variable 'omit' from source: magic vars 44109 1727204265.42889: variable 'omit' from source: magic vars 44109 1727204265.42892: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 44109 1727204265.42894: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 44109 1727204265.42896: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 44109 1727204265.42898: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44109 1727204265.42900: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44109 1727204265.42922: variable 'inventory_hostname' from source: host vars for 'managed-node1' 44109 1727204265.42932: variable 'ansible_host' from source: host vars for 'managed-node1' 44109 1727204265.42941: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed-node1' 44109 1727204265.43053: Set connection var ansible_connection to ssh 44109 1727204265.43066: Set connection var ansible_timeout to 10 44109 1727204265.43078: Set connection var ansible_module_compression to ZIP_DEFLATED 44109 1727204265.43091: Set connection var ansible_pipelining to False 44109 1727204265.43099: Set connection var ansible_shell_executable to /bin/sh 44109 1727204265.43109: Set connection var ansible_shell_type to sh 44109 1727204265.43139: variable 'ansible_shell_executable' from source: unknown 44109 1727204265.43146: variable 'ansible_connection' from source: unknown 44109 1727204265.43153: variable 'ansible_module_compression' from source: unknown 44109 1727204265.43159: variable 'ansible_shell_type' from source: unknown 44109 1727204265.43165: variable 'ansible_shell_executable' from source: unknown 44109 1727204265.43171: variable 'ansible_host' from source: host vars for 'managed-node1' 44109 1727204265.43181: variable 'ansible_pipelining' from source: unknown 44109 1727204265.43187: variable 'ansible_timeout' from source: unknown 44109 1727204265.43197: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44109 1727204265.43342: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 44109 1727204265.43361: variable 'omit' from source: magic vars 44109 1727204265.43371: starting attempt loop 44109 1727204265.43381: running the handler 44109 1727204265.43398: handler run complete 44109 1727204265.43414: attempt loop complete, returning result 44109 1727204265.43427: _execute() done 44109 1727204265.43435: dumping result to json 44109 1727204265.43441: done dumping result, returning 44109 1727204265.43457: done running TaskExecutor() for 
managed-node1/TASK: Initialize NM profile exist and ansible_managed comment flag [028d2410-947f-ed67-a560-00000000068a] 44109 1727204265.43465: sending task result for task 028d2410-947f-ed67-a560-00000000068a ok: [managed-node1] => { "ansible_facts": { "lsr_net_profile_ansible_managed": false, "lsr_net_profile_exists": false, "lsr_net_profile_fingerprint": false }, "changed": false } 44109 1727204265.43678: no more pending results, returning what we have 44109 1727204265.43683: results queue empty 44109 1727204265.43684: checking for any_errors_fatal 44109 1727204265.43685: done checking for any_errors_fatal 44109 1727204265.43686: checking for max_fail_percentage 44109 1727204265.43687: done checking for max_fail_percentage 44109 1727204265.43688: checking to see if all hosts have failed and the running result is not ok 44109 1727204265.43689: done checking to see if all hosts have failed 44109 1727204265.43690: getting the remaining hosts for this loop 44109 1727204265.43691: done getting the remaining hosts for this loop 44109 1727204265.43695: getting the next task for host managed-node1 44109 1727204265.43702: done getting next task for host managed-node1 44109 1727204265.43705: ^ task is: TASK: Stat profile file 44109 1727204265.43709: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44109 1727204265.43715: getting variables 44109 1727204265.43717: in VariableManager get_vars() 44109 1727204265.43745: Calling all_inventory to load vars for managed-node1 44109 1727204265.43748: Calling groups_inventory to load vars for managed-node1 44109 1727204265.43751: Calling all_plugins_inventory to load vars for managed-node1 44109 1727204265.43764: Calling all_plugins_play to load vars for managed-node1 44109 1727204265.43767: Calling groups_plugins_inventory to load vars for managed-node1 44109 1727204265.43770: Calling groups_plugins_play to load vars for managed-node1 44109 1727204265.44393: done sending task result for task 028d2410-947f-ed67-a560-00000000068a 44109 1727204265.44397: WORKER PROCESS EXITING 44109 1727204265.45449: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44109 1727204265.47140: done with get_vars() 44109 1727204265.47166: done getting variables TASK [Stat profile file] ******************************************************* task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:9 Tuesday 24 September 2024 14:57:45 -0400 (0:00:00.060) 0:00:42.269 ***** 44109 1727204265.47276: entering _queue_task() for managed-node1/stat 44109 1727204265.47667: worker is 1 (out of 1 available) 44109 1727204265.47686: exiting _queue_task() for managed-node1/stat 44109 1727204265.47699: done queuing things up, now waiting for results queue to drain 44109 1727204265.47700: waiting for pending results... 
44109 1727204265.47997: running TaskExecutor() for managed-node1/TASK: Stat profile file 44109 1727204265.48202: in run() - task 028d2410-947f-ed67-a560-00000000068b 44109 1727204265.48206: variable 'ansible_search_path' from source: unknown 44109 1727204265.48209: variable 'ansible_search_path' from source: unknown 44109 1727204265.48212: calling self._execute() 44109 1727204265.48316: variable 'ansible_host' from source: host vars for 'managed-node1' 44109 1727204265.48320: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44109 1727204265.48328: variable 'omit' from source: magic vars 44109 1727204265.48651: variable 'ansible_distribution_major_version' from source: facts 44109 1727204265.48661: Evaluated conditional (ansible_distribution_major_version != '6'): True 44109 1727204265.48666: variable 'omit' from source: magic vars 44109 1727204265.48702: variable 'omit' from source: magic vars 44109 1727204265.48774: variable 'profile' from source: include params 44109 1727204265.48779: variable 'interface' from source: set_fact 44109 1727204265.48832: variable 'interface' from source: set_fact 44109 1727204265.48845: variable 'omit' from source: magic vars 44109 1727204265.48881: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 44109 1727204265.48909: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 44109 1727204265.48927: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 44109 1727204265.48941: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44109 1727204265.48950: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44109 1727204265.48979: variable 'inventory_hostname' from source: host vars for 'managed-node1' 44109 
1727204265.48982: variable 'ansible_host' from source: host vars for 'managed-node1' 44109 1727204265.48985: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44109 1727204265.49057: Set connection var ansible_connection to ssh 44109 1727204265.49060: Set connection var ansible_timeout to 10 44109 1727204265.49072: Set connection var ansible_module_compression to ZIP_DEFLATED 44109 1727204265.49075: Set connection var ansible_pipelining to False 44109 1727204265.49079: Set connection var ansible_shell_executable to /bin/sh 44109 1727204265.49084: Set connection var ansible_shell_type to sh 44109 1727204265.49101: variable 'ansible_shell_executable' from source: unknown 44109 1727204265.49103: variable 'ansible_connection' from source: unknown 44109 1727204265.49107: variable 'ansible_module_compression' from source: unknown 44109 1727204265.49111: variable 'ansible_shell_type' from source: unknown 44109 1727204265.49113: variable 'ansible_shell_executable' from source: unknown 44109 1727204265.49115: variable 'ansible_host' from source: host vars for 'managed-node1' 44109 1727204265.49120: variable 'ansible_pipelining' from source: unknown 44109 1727204265.49123: variable 'ansible_timeout' from source: unknown 44109 1727204265.49126: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44109 1727204265.49273: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 44109 1727204265.49285: variable 'omit' from source: magic vars 44109 1727204265.49288: starting attempt loop 44109 1727204265.49290: running the handler 44109 1727204265.49307: _low_level_execute_command(): starting 44109 1727204265.49313: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 44109 1727204265.49824: stderr chunk 
(state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44109 1727204265.49827: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44109 1727204265.49832: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44109 1727204265.49836: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44109 1727204265.49885: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 44109 1727204265.49888: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44109 1727204265.49891: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44109 1727204265.49981: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44109 1727204265.51768: stdout chunk (state=3): >>>/root <<< 44109 1727204265.51870: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44109 1727204265.51900: stderr chunk (state=3): >>><<< 44109 1727204265.51905: stdout chunk (state=3): >>><<< 44109 1727204265.51930: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44109 1727204265.51941: _low_level_execute_command(): starting 44109 1727204265.51947: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204265.5192974-46942-176819307588504 `" && echo ansible-tmp-1727204265.5192974-46942-176819307588504="` echo /root/.ansible/tmp/ansible-tmp-1727204265.5192974-46942-176819307588504 `" ) && sleep 0' 44109 1727204265.52357: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 44109 1727204265.52362: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 <<< 44109 1727204265.52394: stderr chunk (state=3): >>>debug2: match not found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration <<< 44109 1727204265.52398: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44109 1727204265.52454: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 44109 1727204265.52458: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44109 1727204265.52460: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44109 1727204265.52541: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44109 1727204265.54613: stdout chunk (state=3): >>>ansible-tmp-1727204265.5192974-46942-176819307588504=/root/.ansible/tmp/ansible-tmp-1727204265.5192974-46942-176819307588504 <<< 44109 1727204265.54750: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44109 1727204265.54753: stderr chunk (state=3): >>><<< 44109 1727204265.54754: stdout chunk (state=3): >>><<< 44109 1727204265.54766: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204265.5192974-46942-176819307588504=/root/.ansible/tmp/ansible-tmp-1727204265.5192974-46942-176819307588504 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44109 1727204265.54819: variable 'ansible_module_compression' from source: unknown 44109 1727204265.54857: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-44109pzfqangk/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 44109 1727204265.54891: variable 'ansible_facts' from source: unknown 44109 1727204265.54943: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204265.5192974-46942-176819307588504/AnsiballZ_stat.py 44109 1727204265.55044: Sending initial data 44109 1727204265.55048: Sent initial data (153 bytes) 44109 1727204265.55713: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44109 1727204265.55716: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 44109 1727204265.55836: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44109 1727204265.57602: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 44109 1727204265.57802: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 44109 1727204265.57892: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-44109pzfqangk/tmp2i4f0qy6 /root/.ansible/tmp/ansible-tmp-1727204265.5192974-46942-176819307588504/AnsiballZ_stat.py <<< 44109 1727204265.57895: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204265.5192974-46942-176819307588504/AnsiballZ_stat.py" <<< 44109 1727204265.57960: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-44109pzfqangk/tmp2i4f0qy6" to remote "/root/.ansible/tmp/ansible-tmp-1727204265.5192974-46942-176819307588504/AnsiballZ_stat.py" <<< 44109 1727204265.57972: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204265.5192974-46942-176819307588504/AnsiballZ_stat.py" <<< 44109 1727204265.58863: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44109 1727204265.58901: stderr chunk (state=3): >>><<< 44109 1727204265.58904: stdout chunk (state=3): >>><<< 44109 1727204265.58966: done transferring module to remote 44109 1727204265.58979: _low_level_execute_command(): starting 44109 1727204265.58984: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204265.5192974-46942-176819307588504/ /root/.ansible/tmp/ansible-tmp-1727204265.5192974-46942-176819307588504/AnsiballZ_stat.py && sleep 0' 44109 1727204265.59631: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: 
hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44109 1727204265.59693: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 44109 1727204265.59699: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44109 1727204265.59723: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44109 1727204265.59871: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44109 1727204265.61982: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44109 1727204265.61987: stdout chunk (state=3): >>><<< 44109 1727204265.61989: stderr chunk (state=3): >>><<< 44109 1727204265.61992: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44109 1727204265.61994: _low_level_execute_command(): starting 44109 1727204265.61997: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204265.5192974-46942-176819307588504/AnsiballZ_stat.py && sleep 0' 44109 1727204265.62497: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44109 1727204265.62507: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44109 1727204265.62518: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44109 1727204265.62539: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44109 1727204265.62552: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 <<< 44109 1727204265.62558: stderr chunk (state=3): >>>debug2: match not found <<< 44109 1727204265.62568: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44109 1727204265.62584: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 44109 1727204265.62591: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.14.47 is address <<< 44109 1727204265.62598: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 44109 1727204265.62606: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44109 1727204265.62618: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44109 1727204265.62627: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf <<< 44109 1727204265.62689: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44109 1727204265.62717: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 44109 1727204265.62727: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44109 1727204265.62750: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44109 1727204265.62864: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44109 1727204265.79190: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-ethtest0", "follow": false, "checksum_algorithm": "sha1"}}} <<< 44109 1727204265.80784: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.47 closed. 
<<< 44109 1727204265.80789: stdout chunk (state=3): >>><<< 44109 1727204265.80791: stderr chunk (state=3): >>><<< 44109 1727204265.80813: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-ethtest0", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.47 closed. 
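The module's stdout above is the JSON payload Ansible parses into the task result: the profile file `/etc/sysconfig/network-scripts/ifcfg-ethtest0` does not exist, so `stat` returns only `{"exists": false}`. A minimal sketch of that result subset (illustrative only; the real `stat` module reports many more fields when the path is present):

```python
import os


def stat_exists(path: str) -> dict:
    """Reproduce only the `exists` subset of the stat module result
    seen in this run. This is a sketch, not the module's implementation:
    the real module also honors follow/get_checksum/get_mime options.
    """
    return {"changed": False, "stat": {"exists": os.path.exists(path)}}


# For a path that is absent on the target, the shape matches the log:
result = stat_exists("/etc/sysconfig/network-scripts/ifcfg-ethtest0")
```

On the managed node in this run, that call would yield `{"changed": False, "stat": {"exists": False}}`, matching the `ok:` result printed further down.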
44109 1727204265.80857: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/etc/sysconfig/network-scripts/ifcfg-ethtest0', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204265.5192974-46942-176819307588504/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 44109 1727204265.80946: _low_level_execute_command(): starting 44109 1727204265.80950: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204265.5192974-46942-176819307588504/ > /dev/null 2>&1 && sleep 0' 44109 1727204265.81524: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44109 1727204265.81539: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44109 1727204265.81556: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44109 1727204265.81583: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44109 1727204265.81603: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 <<< 44109 1727204265.81626: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44109 1727204265.81692: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address 
debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44109 1727204265.81733: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 44109 1727204265.81750: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44109 1727204265.81766: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44109 1727204265.81890: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44109 1727204265.83982: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44109 1727204265.83985: stderr chunk (state=3): >>><<< 44109 1727204265.83988: stdout chunk (state=3): >>><<< 44109 1727204265.83990: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44109 1727204265.83992: handler run complete 44109 1727204265.83994: attempt loop complete, returning result 44109 1727204265.83996: _execute() done 44109 1727204265.83998: dumping result to json 44109 1727204265.84000: done dumping result, returning 44109 1727204265.84002: done running TaskExecutor() for managed-node1/TASK: Stat profile file [028d2410-947f-ed67-a560-00000000068b] 44109 1727204265.84005: sending task result for task 028d2410-947f-ed67-a560-00000000068b 44109 1727204265.84096: done sending task result for task 028d2410-947f-ed67-a560-00000000068b 44109 1727204265.84098: WORKER PROCESS EXITING ok: [managed-node1] => { "changed": false, "stat": { "exists": false } } 44109 1727204265.84174: no more pending results, returning what we have 44109 1727204265.84181: results queue empty 44109 1727204265.84182: checking for any_errors_fatal 44109 1727204265.84190: done checking for any_errors_fatal 44109 1727204265.84191: checking for max_fail_percentage 44109 1727204265.84193: done checking for max_fail_percentage 44109 1727204265.84194: checking to see if all hosts have failed and the running result is not ok 44109 1727204265.84195: done checking to see if all hosts have failed 44109 1727204265.84196: getting the remaining hosts for this loop 44109 1727204265.84197: done getting the remaining hosts for this loop 44109 1727204265.84201: getting the next task for host managed-node1 44109 1727204265.84209: done getting next task for host managed-node1 44109 1727204265.84211: ^ task is: TASK: Set NM profile exist flag based on the profile files 44109 1727204265.84219: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, 
fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44109 1727204265.84223: getting variables 44109 1727204265.84225: in VariableManager get_vars() 44109 1727204265.84255: Calling all_inventory to load vars for managed-node1 44109 1727204265.84258: Calling groups_inventory to load vars for managed-node1 44109 1727204265.84262: Calling all_plugins_inventory to load vars for managed-node1 44109 1727204265.84274: Calling all_plugins_play to load vars for managed-node1 44109 1727204265.84484: Calling groups_plugins_inventory to load vars for managed-node1 44109 1727204265.84489: Calling groups_plugins_play to load vars for managed-node1 44109 1727204265.85960: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44109 1727204265.87631: done with get_vars() 44109 1727204265.87664: done getting variables 44109 1727204265.87730: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag based on the profile files] ******************** 
task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:17 Tuesday 24 September 2024 14:57:45 -0400 (0:00:00.404) 0:00:42.673 ***** 44109 1727204265.87770: entering _queue_task() for managed-node1/set_fact 44109 1727204265.88157: worker is 1 (out of 1 available) 44109 1727204265.88169: exiting _queue_task() for managed-node1/set_fact 44109 1727204265.88385: done queuing things up, now waiting for results queue to drain 44109 1727204265.88386: waiting for pending results... 44109 1727204265.88481: running TaskExecutor() for managed-node1/TASK: Set NM profile exist flag based on the profile files 44109 1727204265.88621: in run() - task 028d2410-947f-ed67-a560-00000000068c 44109 1727204265.88625: variable 'ansible_search_path' from source: unknown 44109 1727204265.88629: variable 'ansible_search_path' from source: unknown 44109 1727204265.88651: calling self._execute() 44109 1727204265.88754: variable 'ansible_host' from source: host vars for 'managed-node1' 44109 1727204265.88757: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44109 1727204265.88881: variable 'omit' from source: magic vars 44109 1727204265.89170: variable 'ansible_distribution_major_version' from source: facts 44109 1727204265.89183: Evaluated conditional (ansible_distribution_major_version != '6'): True 44109 1727204265.89310: variable 'profile_stat' from source: set_fact 44109 1727204265.89326: Evaluated conditional (profile_stat.stat.exists): False 44109 1727204265.89330: when evaluation is False, skipping this task 44109 1727204265.89333: _execute() done 44109 1727204265.89336: dumping result to json 44109 1727204265.89338: done dumping result, returning 44109 1727204265.89344: done running TaskExecutor() for managed-node1/TASK: Set NM profile exist flag based on the profile files [028d2410-947f-ed67-a560-00000000068c] 44109 1727204265.89349: sending task result for task 
028d2410-947f-ed67-a560-00000000068c 44109 1727204265.89444: done sending task result for task 028d2410-947f-ed67-a560-00000000068c 44109 1727204265.89447: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 44109 1727204265.89527: no more pending results, returning what we have 44109 1727204265.89533: results queue empty 44109 1727204265.89534: checking for any_errors_fatal 44109 1727204265.89543: done checking for any_errors_fatal 44109 1727204265.89544: checking for max_fail_percentage 44109 1727204265.89546: done checking for max_fail_percentage 44109 1727204265.89547: checking to see if all hosts have failed and the running result is not ok 44109 1727204265.89548: done checking to see if all hosts have failed 44109 1727204265.89549: getting the remaining hosts for this loop 44109 1727204265.89550: done getting the remaining hosts for this loop 44109 1727204265.89555: getting the next task for host managed-node1 44109 1727204265.89562: done getting next task for host managed-node1 44109 1727204265.89566: ^ task is: TASK: Get NM profile info 44109 1727204265.89571: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False 44109 1727204265.89581: getting variables 44109 1727204265.89584: in VariableManager get_vars() 44109 1727204265.89619: Calling all_inventory to load vars for managed-node1 44109 1727204265.89623: Calling groups_inventory to load vars for managed-node1 44109 1727204265.89628: Calling all_plugins_inventory to load vars for managed-node1 44109 1727204265.89643: Calling all_plugins_play to load vars for managed-node1 44109 1727204265.89646: Calling groups_plugins_inventory to load vars for managed-node1 44109 1727204265.89649: Calling groups_plugins_play to load vars for managed-node1 44109 1727204265.96532: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44109 1727204265.98557: done with get_vars() 44109 1727204265.98594: done getting variables 44109 1727204265.98684: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [Get NM profile info] ***************************************************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:25 Tuesday 24 September 2024 14:57:45 -0400 (0:00:00.109) 0:00:42.783 ***** 44109 1727204265.98722: entering _queue_task() for managed-node1/shell 44109 1727204265.98723: Creating lock for shell 44109 1727204265.99316: worker is 1 (out of 1 available) 44109 1727204265.99328: exiting _queue_task() for managed-node1/shell 44109 1727204265.99338: done queuing things up, now waiting for results queue to drain 44109 1727204265.99339: waiting for pending results... 
44109 1727204265.99630: running TaskExecutor() for managed-node1/TASK: Get NM profile info 44109 1727204265.99713: in run() - task 028d2410-947f-ed67-a560-00000000068d 44109 1727204265.99743: variable 'ansible_search_path' from source: unknown 44109 1727204265.99753: variable 'ansible_search_path' from source: unknown 44109 1727204265.99798: calling self._execute() 44109 1727204265.99912: variable 'ansible_host' from source: host vars for 'managed-node1' 44109 1727204265.99925: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44109 1727204265.99943: variable 'omit' from source: magic vars 44109 1727204266.00357: variable 'ansible_distribution_major_version' from source: facts 44109 1727204266.00380: Evaluated conditional (ansible_distribution_major_version != '6'): True 44109 1727204266.00481: variable 'omit' from source: magic vars 44109 1727204266.00487: variable 'omit' from source: magic vars 44109 1727204266.00549: variable 'profile' from source: include params 44109 1727204266.00559: variable 'interface' from source: set_fact 44109 1727204266.00637: variable 'interface' from source: set_fact 44109 1727204266.00663: variable 'omit' from source: magic vars 44109 1727204266.00717: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 44109 1727204266.00930: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 44109 1727204266.00933: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 44109 1727204266.00936: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44109 1727204266.00938: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44109 1727204266.00943: variable 'inventory_hostname' from source: host vars for 'managed-node1' 44109 
1727204266.00953: variable 'ansible_host' from source: host vars for 'managed-node1' 44109 1727204266.00961: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44109 1727204266.01070: Set connection var ansible_connection to ssh 44109 1727204266.01085: Set connection var ansible_timeout to 10 44109 1727204266.01097: Set connection var ansible_module_compression to ZIP_DEFLATED 44109 1727204266.01111: Set connection var ansible_pipelining to False 44109 1727204266.01121: Set connection var ansible_shell_executable to /bin/sh 44109 1727204266.01130: Set connection var ansible_shell_type to sh 44109 1727204266.01160: variable 'ansible_shell_executable' from source: unknown 44109 1727204266.01168: variable 'ansible_connection' from source: unknown 44109 1727204266.01174: variable 'ansible_module_compression' from source: unknown 44109 1727204266.01185: variable 'ansible_shell_type' from source: unknown 44109 1727204266.01193: variable 'ansible_shell_executable' from source: unknown 44109 1727204266.01200: variable 'ansible_host' from source: host vars for 'managed-node1' 44109 1727204266.01473: variable 'ansible_pipelining' from source: unknown 44109 1727204266.01478: variable 'ansible_timeout' from source: unknown 44109 1727204266.01481: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44109 1727204266.01540: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 44109 1727204266.01611: variable 'omit' from source: magic vars 44109 1727204266.01621: starting attempt loop 44109 1727204266.01628: running the handler 44109 1727204266.01641: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 44109 1727204266.01666: _low_level_execute_command(): starting 44109 1727204266.01720: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 44109 1727204266.02605: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44109 1727204266.02621: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44109 1727204266.02635: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44109 1727204266.02654: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44109 1727204266.02707: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44109 1727204266.02773: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK <<< 44109 1727204266.02926: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44109 1727204266.03023: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44109 1727204266.04867: stdout chunk (state=3): >>>/root <<< 44109 1727204266.04945: stderr chunk 
(state=3): >>>debug2: Received exit status from master 0 <<< 44109 1727204266.04948: stdout chunk (state=3): >>><<< 44109 1727204266.04951: stderr chunk (state=3): >>><<< 44109 1727204266.04971: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44109 1727204266.04993: _low_level_execute_command(): starting 44109 1727204266.05482: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204266.0497987-46967-253662274728084 `" && echo ansible-tmp-1727204266.0497987-46967-253662274728084="` echo /root/.ansible/tmp/ansible-tmp-1727204266.0497987-46967-253662274728084 `" ) && sleep 0' 44109 1727204266.06381: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44109 1727204266.06549: stderr chunk 
(state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK <<< 44109 1727204266.06811: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44109 1727204266.06892: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44109 1727204266.09001: stdout chunk (state=3): >>>ansible-tmp-1727204266.0497987-46967-253662274728084=/root/.ansible/tmp/ansible-tmp-1727204266.0497987-46967-253662274728084 <<< 44109 1727204266.09147: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44109 1727204266.09150: stdout chunk (state=3): >>><<< 44109 1727204266.09159: stderr chunk (state=3): >>><<< 44109 1727204266.09185: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204266.0497987-46967-253662274728084=/root/.ansible/tmp/ansible-tmp-1727204266.0497987-46967-253662274728084 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44109 1727204266.09223: variable 'ansible_module_compression' from source: unknown 44109 1727204266.09281: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-44109pzfqangk/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 44109 1727204266.09328: variable 'ansible_facts' from source: unknown 44109 1727204266.09705: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204266.0497987-46967-253662274728084/AnsiballZ_command.py 44109 1727204266.10081: Sending initial data 44109 1727204266.10084: Sent initial data (156 bytes) 44109 1727204266.11582: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config 
debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44109 1727204266.11586: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 44109 1727204266.11589: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44109 1727204266.11591: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44109 1727204266.11593: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44109 1727204266.13337: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 44109 1727204266.13411: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 44109 1727204266.13489: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-44109pzfqangk/tmpw7s4myd0 /root/.ansible/tmp/ansible-tmp-1727204266.0497987-46967-253662274728084/AnsiballZ_command.py <<< 44109 1727204266.13500: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204266.0497987-46967-253662274728084/AnsiballZ_command.py" <<< 44109 1727204266.13564: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-44109pzfqangk/tmpw7s4myd0" to remote "/root/.ansible/tmp/ansible-tmp-1727204266.0497987-46967-253662274728084/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204266.0497987-46967-253662274728084/AnsiballZ_command.py" <<< 44109 1727204266.14926: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44109 1727204266.14936: stdout chunk (state=3): >>><<< 44109 1727204266.14946: stderr chunk (state=3): >>><<< 44109 1727204266.15014: done transferring module to remote 44109 1727204266.15094: _low_level_execute_command(): starting 44109 1727204266.15103: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204266.0497987-46967-253662274728084/ /root/.ansible/tmp/ansible-tmp-1727204266.0497987-46967-253662274728084/AnsiballZ_command.py && sleep 0' 44109 1727204266.16268: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 44109 1727204266.16284: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 44109 1727204266.16368: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44109 1727204266.16417: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 44109 1727204266.16480: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44109 1727204266.16573: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44109 1727204266.16670: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44109 1727204266.18714: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44109 1727204266.18883: stderr chunk (state=3): >>><<< 44109 1727204266.18887: stdout chunk (state=3): >>><<< 44109 1727204266.18890: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' 
host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44109 1727204266.18892: _low_level_execute_command(): starting 44109 1727204266.18895: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204266.0497987-46967-253662274728084/AnsiballZ_command.py && sleep 0' 44109 1727204266.20480: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 44109 1727204266.20697: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 44109 1727204266.20827: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44109 1727204266.38960: stdout chunk 
(state=3): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 1, "cmd": "nmcli -f NAME,FILENAME connection show |grep ethtest0 | grep /etc", "start": "2024-09-24 14:57:46.370805", "end": "2024-09-24 14:57:46.387757", "delta": "0:00:00.016952", "failed": true, "msg": "non-zero return code", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep ethtest0 | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 44109 1727204266.40706: stderr chunk (state=3): >>>debug2: Received exit status from master 1 Shared connection to 10.31.14.47 closed. <<< 44109 1727204266.40731: stdout chunk (state=3): >>><<< 44109 1727204266.40750: stderr chunk (state=3): >>><<< 44109 1727204266.40765: _low_level_execute_command() done: rc=1, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 1, "cmd": "nmcli -f NAME,FILENAME connection show |grep ethtest0 | grep /etc", "start": "2024-09-24 14:57:46.370805", "end": "2024-09-24 14:57:46.387757", "delta": "0:00:00.016952", "failed": true, "msg": "non-zero return code", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep ethtest0 | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match 
pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 1 Shared connection to 10.31.14.47 closed. 44109 1727204266.40793: done with _execute_module (ansible.legacy.command, {'_raw_params': 'nmcli -f NAME,FILENAME connection show |grep ethtest0 | grep /etc', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204266.0497987-46967-253662274728084/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 44109 1727204266.40803: _low_level_execute_command(): starting 44109 1727204266.40808: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204266.0497987-46967-253662274728084/ > /dev/null 2>&1 && sleep 0' 44109 1727204266.41361: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration 
data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 44109 1727204266.41428: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44109 1727204266.43394: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44109 1727204266.43419: stderr chunk (state=3): >>><<< 44109 1727204266.43422: stdout chunk (state=3): >>><<< 44109 1727204266.43438: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44109 1727204266.43444: handler run complete 44109 1727204266.43461: Evaluated conditional (False): False 44109 1727204266.43469: attempt loop complete, returning result 44109 1727204266.43471: _execute() done 44109 1727204266.43474: dumping result to json 44109 1727204266.43480: done dumping result, returning 44109 1727204266.43487: done running TaskExecutor() for managed-node1/TASK: Get NM profile info [028d2410-947f-ed67-a560-00000000068d] 44109 1727204266.43489: sending task result for task 028d2410-947f-ed67-a560-00000000068d fatal: [managed-node1]: FAILED! 
=> { "changed": false, "cmd": "nmcli -f NAME,FILENAME connection show |grep ethtest0 | grep /etc", "delta": "0:00:00.016952", "end": "2024-09-24 14:57:46.387757", "rc": 1, "start": "2024-09-24 14:57:46.370805" } MSG: non-zero return code ...ignoring 44109 1727204266.43650: no more pending results, returning what we have 44109 1727204266.43654: results queue empty 44109 1727204266.43655: checking for any_errors_fatal 44109 1727204266.43662: done checking for any_errors_fatal 44109 1727204266.43663: checking for max_fail_percentage 44109 1727204266.43664: done checking for max_fail_percentage 44109 1727204266.43665: checking to see if all hosts have failed and the running result is not ok 44109 1727204266.43666: done checking to see if all hosts have failed 44109 1727204266.43666: getting the remaining hosts for this loop 44109 1727204266.43668: done getting the remaining hosts for this loop 44109 1727204266.43671: getting the next task for host managed-node1 44109 1727204266.43681: done getting next task for host managed-node1 44109 1727204266.43683: ^ task is: TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output 44109 1727204266.43687: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False 44109 1727204266.43691: getting variables 44109 1727204266.43694: in VariableManager get_vars() 44109 1727204266.43726: Calling all_inventory to load vars for managed-node1 44109 1727204266.43728: Calling groups_inventory to load vars for managed-node1 44109 1727204266.43731: Calling all_plugins_inventory to load vars for managed-node1 44109 1727204266.43741: Calling all_plugins_play to load vars for managed-node1 44109 1727204266.43744: Calling groups_plugins_inventory to load vars for managed-node1 44109 1727204266.43747: Calling groups_plugins_play to load vars for managed-node1 44109 1727204266.44289: done sending task result for task 028d2410-947f-ed67-a560-00000000068d 44109 1727204266.44293: WORKER PROCESS EXITING 44109 1727204266.44583: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44109 1727204266.45469: done with get_vars() 44109 1727204266.45489: done getting variables 44109 1727204266.45536: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag and ansible_managed flag true based on the nmcli output] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:35 Tuesday 24 September 2024 14:57:46 -0400 (0:00:00.468) 0:00:43.251 ***** 44109 1727204266.45559: entering _queue_task() for managed-node1/set_fact 44109 1727204266.45805: worker is 1 (out of 1 available) 44109 1727204266.45820: exiting _queue_task() for managed-node1/set_fact 44109 1727204266.45832: done queuing things up, now waiting for results queue to drain 44109 1727204266.45833: waiting for pending results... 
44109 1727204266.46002: running TaskExecutor() for managed-node1/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output 44109 1727204266.46092: in run() - task 028d2410-947f-ed67-a560-00000000068e 44109 1727204266.46105: variable 'ansible_search_path' from source: unknown 44109 1727204266.46108: variable 'ansible_search_path' from source: unknown 44109 1727204266.46136: calling self._execute() 44109 1727204266.46216: variable 'ansible_host' from source: host vars for 'managed-node1' 44109 1727204266.46220: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44109 1727204266.46226: variable 'omit' from source: magic vars 44109 1727204266.46498: variable 'ansible_distribution_major_version' from source: facts 44109 1727204266.46508: Evaluated conditional (ansible_distribution_major_version != '6'): True 44109 1727204266.46595: variable 'nm_profile_exists' from source: set_fact 44109 1727204266.46605: Evaluated conditional (nm_profile_exists.rc == 0): False 44109 1727204266.46609: when evaluation is False, skipping this task 44109 1727204266.46611: _execute() done 44109 1727204266.46617: dumping result to json 44109 1727204266.46619: done dumping result, returning 44109 1727204266.46625: done running TaskExecutor() for managed-node1/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output [028d2410-947f-ed67-a560-00000000068e] 44109 1727204266.46630: sending task result for task 028d2410-947f-ed67-a560-00000000068e 44109 1727204266.46724: done sending task result for task 028d2410-947f-ed67-a560-00000000068e 44109 1727204266.46726: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "nm_profile_exists.rc == 0", "skip_reason": "Conditional result was False" } 44109 1727204266.46781: no more pending results, returning what we have 44109 1727204266.46785: results queue empty 44109 1727204266.46786: checking for any_errors_fatal 44109 
1727204266.46795: done checking for any_errors_fatal 44109 1727204266.46795: checking for max_fail_percentage 44109 1727204266.46797: done checking for max_fail_percentage 44109 1727204266.46798: checking to see if all hosts have failed and the running result is not ok 44109 1727204266.46799: done checking to see if all hosts have failed 44109 1727204266.46800: getting the remaining hosts for this loop 44109 1727204266.46801: done getting the remaining hosts for this loop 44109 1727204266.46804: getting the next task for host managed-node1 44109 1727204266.46817: done getting next task for host managed-node1 44109 1727204266.46820: ^ task is: TASK: Get the ansible_managed comment in ifcfg-{{ profile }} 44109 1727204266.46823: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44109 1727204266.46826: getting variables 44109 1727204266.46828: in VariableManager get_vars() 44109 1727204266.46852: Calling all_inventory to load vars for managed-node1 44109 1727204266.46854: Calling groups_inventory to load vars for managed-node1 44109 1727204266.46857: Calling all_plugins_inventory to load vars for managed-node1 44109 1727204266.46866: Calling all_plugins_play to load vars for managed-node1 44109 1727204266.46869: Calling groups_plugins_inventory to load vars for managed-node1 44109 1727204266.46871: Calling groups_plugins_play to load vars for managed-node1 44109 1727204266.47774: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44109 1727204266.48838: done with get_vars() 44109 1727204266.48861: done getting variables 44109 1727204266.48918: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 44109 1727204266.49030: variable 'profile' from source: include params 44109 1727204266.49034: variable 'interface' from source: set_fact 44109 1727204266.49091: variable 'interface' from source: set_fact TASK [Get the ansible_managed comment in ifcfg-ethtest0] *********************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:49 Tuesday 24 September 2024 14:57:46 -0400 (0:00:00.035) 0:00:43.287 ***** 44109 1727204266.49124: entering _queue_task() for managed-node1/command 44109 1727204266.49440: worker is 1 (out of 1 available) 44109 1727204266.49452: exiting _queue_task() for managed-node1/command 44109 1727204266.49462: done queuing things up, now waiting for results queue to drain 44109 1727204266.49463: waiting for pending results... 
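The skip above (`"false_condition": "nm_profile_exists.rc == 0"`) is what a `when`-guarded `set_fact` produces when its condition is false. A minimal sketch consistent with the task title in the log (the second fact name is an assumption; only `lsr_net_profile_exists` is visible later in this log):

```yaml
# Sketch only: "lsr_net_profile_exists" appears later in this log; the
# ansible_managed fact name is guessed from the task title.
- name: Set NM profile exist flag and ansible_managed flag true based on the nmcli output
  set_fact:
    lsr_net_profile_exists: true
    lsr_net_profile_ansible_managed: true
  when: nm_profile_exists.rc == 0  # rc was 1 here, so the task is skipped
```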
44109 1727204266.49890: running TaskExecutor() for managed-node1/TASK: Get the ansible_managed comment in ifcfg-ethtest0 44109 1727204266.49895: in run() - task 028d2410-947f-ed67-a560-000000000690 44109 1727204266.49898: variable 'ansible_search_path' from source: unknown 44109 1727204266.49900: variable 'ansible_search_path' from source: unknown 44109 1727204266.49908: calling self._execute() 44109 1727204266.50004: variable 'ansible_host' from source: host vars for 'managed-node1' 44109 1727204266.50020: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44109 1727204266.50032: variable 'omit' from source: magic vars 44109 1727204266.50377: variable 'ansible_distribution_major_version' from source: facts 44109 1727204266.50387: Evaluated conditional (ansible_distribution_major_version != '6'): True 44109 1727204266.50470: variable 'profile_stat' from source: set_fact 44109 1727204266.50481: Evaluated conditional (profile_stat.stat.exists): False 44109 1727204266.50485: when evaluation is False, skipping this task 44109 1727204266.50488: _execute() done 44109 1727204266.50490: dumping result to json 44109 1727204266.50493: done dumping result, returning 44109 1727204266.50500: done running TaskExecutor() for managed-node1/TASK: Get the ansible_managed comment in ifcfg-ethtest0 [028d2410-947f-ed67-a560-000000000690] 44109 1727204266.50505: sending task result for task 028d2410-947f-ed67-a560-000000000690 44109 1727204266.50597: done sending task result for task 028d2410-947f-ed67-a560-000000000690 44109 1727204266.50599: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 44109 1727204266.50648: no more pending results, returning what we have 44109 1727204266.50652: results queue empty 44109 1727204266.50653: checking for any_errors_fatal 44109 1727204266.50658: done checking for any_errors_fatal 44109 1727204266.50659: 
checking for max_fail_percentage 44109 1727204266.50661: done checking for max_fail_percentage 44109 1727204266.50662: checking to see if all hosts have failed and the running result is not ok 44109 1727204266.50663: done checking to see if all hosts have failed 44109 1727204266.50663: getting the remaining hosts for this loop 44109 1727204266.50665: done getting the remaining hosts for this loop 44109 1727204266.50669: getting the next task for host managed-node1 44109 1727204266.50677: done getting next task for host managed-node1 44109 1727204266.50680: ^ task is: TASK: Verify the ansible_managed comment in ifcfg-{{ profile }} 44109 1727204266.50684: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44109 1727204266.50688: getting variables 44109 1727204266.50689: in VariableManager get_vars() 44109 1727204266.50722: Calling all_inventory to load vars for managed-node1 44109 1727204266.50725: Calling groups_inventory to load vars for managed-node1 44109 1727204266.50728: Calling all_plugins_inventory to load vars for managed-node1 44109 1727204266.50739: Calling all_plugins_play to load vars for managed-node1 44109 1727204266.50742: Calling groups_plugins_inventory to load vars for managed-node1 44109 1727204266.50744: Calling groups_plugins_play to load vars for managed-node1 44109 1727204266.51531: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44109 1727204266.52520: done with get_vars() 44109 1727204266.52537: done getting variables 44109 1727204266.52583: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 44109 1727204266.52664: variable 'profile' from source: include params 44109 1727204266.52667: variable 'interface' from source: set_fact 44109 1727204266.52709: variable 'interface' from source: set_fact TASK [Verify the ansible_managed comment in ifcfg-ethtest0] ******************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:56 Tuesday 24 September 2024 14:57:46 -0400 (0:00:00.036) 0:00:43.323 ***** 44109 1727204266.52732: entering _queue_task() for managed-node1/set_fact 44109 1727204266.52983: worker is 1 (out of 1 available) 44109 1727204266.52995: exiting _queue_task() for managed-node1/set_fact 44109 1727204266.53008: done queuing things up, now waiting for results queue to drain 44109 1727204266.53009: waiting for pending results... 
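The four `ifcfg-ethtest0` tasks in this stretch of the log all skip on the same condition, `profile_stat.stat.exists`. That is the common pattern of a `stat` result registered earlier guarding follow-up tasks; roughly (the ifcfg path is an assumption inferred from the task titles, not shown in this log):

```yaml
# Sketch of the guard pattern; path and grep target are assumptions.
- name: Stat the ifcfg file
  stat:
    path: /etc/sysconfig/network-scripts/ifcfg-ethtest0
  register: profile_stat

- name: Get the ansible_managed comment in ifcfg-ethtest0
  command: grep ansible_managed /etc/sysconfig/network-scripts/ifcfg-ethtest0
  when: profile_stat.stat.exists  # False here, so all the follow-ups skip
```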
44109 1727204266.53189: running TaskExecutor() for managed-node1/TASK: Verify the ansible_managed comment in ifcfg-ethtest0 44109 1727204266.53278: in run() - task 028d2410-947f-ed67-a560-000000000691 44109 1727204266.53289: variable 'ansible_search_path' from source: unknown 44109 1727204266.53292: variable 'ansible_search_path' from source: unknown 44109 1727204266.53324: calling self._execute() 44109 1727204266.53398: variable 'ansible_host' from source: host vars for 'managed-node1' 44109 1727204266.53402: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44109 1727204266.53411: variable 'omit' from source: magic vars 44109 1727204266.53678: variable 'ansible_distribution_major_version' from source: facts 44109 1727204266.53688: Evaluated conditional (ansible_distribution_major_version != '6'): True 44109 1727204266.53769: variable 'profile_stat' from source: set_fact 44109 1727204266.53782: Evaluated conditional (profile_stat.stat.exists): False 44109 1727204266.53785: when evaluation is False, skipping this task 44109 1727204266.53789: _execute() done 44109 1727204266.53791: dumping result to json 44109 1727204266.53794: done dumping result, returning 44109 1727204266.53800: done running TaskExecutor() for managed-node1/TASK: Verify the ansible_managed comment in ifcfg-ethtest0 [028d2410-947f-ed67-a560-000000000691] 44109 1727204266.53805: sending task result for task 028d2410-947f-ed67-a560-000000000691 44109 1727204266.53890: done sending task result for task 028d2410-947f-ed67-a560-000000000691 44109 1727204266.53893: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 44109 1727204266.53938: no more pending results, returning what we have 44109 1727204266.53942: results queue empty 44109 1727204266.53943: checking for any_errors_fatal 44109 1727204266.53950: done checking for any_errors_fatal 44109 1727204266.53951: 
checking for max_fail_percentage 44109 1727204266.53952: done checking for max_fail_percentage 44109 1727204266.53953: checking to see if all hosts have failed and the running result is not ok 44109 1727204266.53954: done checking to see if all hosts have failed 44109 1727204266.53955: getting the remaining hosts for this loop 44109 1727204266.53956: done getting the remaining hosts for this loop 44109 1727204266.53960: getting the next task for host managed-node1 44109 1727204266.53967: done getting next task for host managed-node1 44109 1727204266.53970: ^ task is: TASK: Get the fingerprint comment in ifcfg-{{ profile }} 44109 1727204266.53973: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44109 1727204266.53979: getting variables 44109 1727204266.53981: in VariableManager get_vars() 44109 1727204266.54010: Calling all_inventory to load vars for managed-node1 44109 1727204266.54012: Calling groups_inventory to load vars for managed-node1 44109 1727204266.54016: Calling all_plugins_inventory to load vars for managed-node1 44109 1727204266.54027: Calling all_plugins_play to load vars for managed-node1 44109 1727204266.54030: Calling groups_plugins_inventory to load vars for managed-node1 44109 1727204266.54033: Calling groups_plugins_play to load vars for managed-node1 44109 1727204266.54849: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44109 1727204266.55730: done with get_vars() 44109 1727204266.55749: done getting variables 44109 1727204266.55793: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 44109 1727204266.55874: variable 'profile' from source: include params 44109 1727204266.55879: variable 'interface' from source: set_fact 44109 1727204266.55920: variable 'interface' from source: set_fact TASK [Get the fingerprint comment in ifcfg-ethtest0] *************************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:62 Tuesday 24 September 2024 14:57:46 -0400 (0:00:00.032) 0:00:43.355 ***** 44109 1727204266.55943: entering _queue_task() for managed-node1/command 44109 1727204266.56192: worker is 1 (out of 1 available) 44109 1727204266.56207: exiting _queue_task() for managed-node1/command 44109 1727204266.56217: done queuing things up, now waiting for results queue to drain 44109 1727204266.56218: waiting for pending results... 
44109 1727204266.56395: running TaskExecutor() for managed-node1/TASK: Get the fingerprint comment in ifcfg-ethtest0 44109 1727204266.56484: in run() - task 028d2410-947f-ed67-a560-000000000692 44109 1727204266.56494: variable 'ansible_search_path' from source: unknown 44109 1727204266.56497: variable 'ansible_search_path' from source: unknown 44109 1727204266.56529: calling self._execute() 44109 1727204266.56603: variable 'ansible_host' from source: host vars for 'managed-node1' 44109 1727204266.56607: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44109 1727204266.56618: variable 'omit' from source: magic vars 44109 1727204266.56881: variable 'ansible_distribution_major_version' from source: facts 44109 1727204266.56889: Evaluated conditional (ansible_distribution_major_version != '6'): True 44109 1727204266.56972: variable 'profile_stat' from source: set_fact 44109 1727204266.56985: Evaluated conditional (profile_stat.stat.exists): False 44109 1727204266.56989: when evaluation is False, skipping this task 44109 1727204266.56992: _execute() done 44109 1727204266.56994: dumping result to json 44109 1727204266.56997: done dumping result, returning 44109 1727204266.57004: done running TaskExecutor() for managed-node1/TASK: Get the fingerprint comment in ifcfg-ethtest0 [028d2410-947f-ed67-a560-000000000692] 44109 1727204266.57007: sending task result for task 028d2410-947f-ed67-a560-000000000692 44109 1727204266.57092: done sending task result for task 028d2410-947f-ed67-a560-000000000692 44109 1727204266.57094: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 44109 1727204266.57152: no more pending results, returning what we have 44109 1727204266.57156: results queue empty 44109 1727204266.57157: checking for any_errors_fatal 44109 1727204266.57164: done checking for any_errors_fatal 44109 1727204266.57165: checking for 
max_fail_percentage 44109 1727204266.57166: done checking for max_fail_percentage 44109 1727204266.57168: checking to see if all hosts have failed and the running result is not ok 44109 1727204266.57168: done checking to see if all hosts have failed 44109 1727204266.57169: getting the remaining hosts for this loop 44109 1727204266.57170: done getting the remaining hosts for this loop 44109 1727204266.57174: getting the next task for host managed-node1 44109 1727204266.57184: done getting next task for host managed-node1 44109 1727204266.57186: ^ task is: TASK: Verify the fingerprint comment in ifcfg-{{ profile }} 44109 1727204266.57189: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44109 1727204266.57193: getting variables 44109 1727204266.57195: in VariableManager get_vars() 44109 1727204266.57221: Calling all_inventory to load vars for managed-node1 44109 1727204266.57224: Calling groups_inventory to load vars for managed-node1 44109 1727204266.57227: Calling all_plugins_inventory to load vars for managed-node1 44109 1727204266.57238: Calling all_plugins_play to load vars for managed-node1 44109 1727204266.57241: Calling groups_plugins_inventory to load vars for managed-node1 44109 1727204266.57243: Calling groups_plugins_play to load vars for managed-node1 44109 1727204266.58163: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44109 1727204266.59022: done with get_vars() 44109 1727204266.59037: done getting variables 44109 1727204266.59079: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 44109 1727204266.59156: variable 'profile' from source: include params 44109 1727204266.59159: variable 'interface' from source: set_fact 44109 1727204266.59198: variable 'interface' from source: set_fact TASK [Verify the fingerprint comment in ifcfg-ethtest0] ************************ task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:69 Tuesday 24 September 2024 14:57:46 -0400 (0:00:00.032) 0:00:43.388 ***** 44109 1727204266.59219: entering _queue_task() for managed-node1/set_fact 44109 1727204266.59449: worker is 1 (out of 1 available) 44109 1727204266.59461: exiting _queue_task() for managed-node1/set_fact 44109 1727204266.59473: done queuing things up, now waiting for results queue to drain 44109 1727204266.59474: waiting for pending results... 
44109 1727204266.59649: running TaskExecutor() for managed-node1/TASK: Verify the fingerprint comment in ifcfg-ethtest0 44109 1727204266.59730: in run() - task 028d2410-947f-ed67-a560-000000000693 44109 1727204266.59741: variable 'ansible_search_path' from source: unknown 44109 1727204266.59744: variable 'ansible_search_path' from source: unknown 44109 1727204266.59772: calling self._execute() 44109 1727204266.59850: variable 'ansible_host' from source: host vars for 'managed-node1' 44109 1727204266.59853: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44109 1727204266.59862: variable 'omit' from source: magic vars 44109 1727204266.60125: variable 'ansible_distribution_major_version' from source: facts 44109 1727204266.60137: Evaluated conditional (ansible_distribution_major_version != '6'): True 44109 1727204266.60219: variable 'profile_stat' from source: set_fact 44109 1727204266.60230: Evaluated conditional (profile_stat.stat.exists): False 44109 1727204266.60233: when evaluation is False, skipping this task 44109 1727204266.60235: _execute() done 44109 1727204266.60240: dumping result to json 44109 1727204266.60242: done dumping result, returning 44109 1727204266.60250: done running TaskExecutor() for managed-node1/TASK: Verify the fingerprint comment in ifcfg-ethtest0 [028d2410-947f-ed67-a560-000000000693] 44109 1727204266.60253: sending task result for task 028d2410-947f-ed67-a560-000000000693 44109 1727204266.60333: done sending task result for task 028d2410-947f-ed67-a560-000000000693 44109 1727204266.60336: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 44109 1727204266.60399: no more pending results, returning what we have 44109 1727204266.60402: results queue empty 44109 1727204266.60403: checking for any_errors_fatal 44109 1727204266.60409: done checking for any_errors_fatal 44109 1727204266.60410: 
checking for max_fail_percentage 44109 1727204266.60412: done checking for max_fail_percentage 44109 1727204266.60413: checking to see if all hosts have failed and the running result is not ok 44109 1727204266.60414: done checking to see if all hosts have failed 44109 1727204266.60414: getting the remaining hosts for this loop 44109 1727204266.60416: done getting the remaining hosts for this loop 44109 1727204266.60419: getting the next task for host managed-node1 44109 1727204266.60426: done getting next task for host managed-node1 44109 1727204266.60428: ^ task is: TASK: Assert that the profile is absent - '{{ profile }}' 44109 1727204266.60431: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44109 1727204266.60434: getting variables 44109 1727204266.60435: in VariableManager get_vars() 44109 1727204266.60461: Calling all_inventory to load vars for managed-node1 44109 1727204266.60463: Calling groups_inventory to load vars for managed-node1 44109 1727204266.60466: Calling all_plugins_inventory to load vars for managed-node1 44109 1727204266.60477: Calling all_plugins_play to load vars for managed-node1 44109 1727204266.60479: Calling groups_plugins_inventory to load vars for managed-node1 44109 1727204266.60482: Calling groups_plugins_play to load vars for managed-node1 44109 1727204266.61252: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44109 1727204266.62146: done with get_vars() 44109 1727204266.62162: done getting variables 44109 1727204266.62206: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 44109 1727204266.62283: variable 'profile' from source: include params 44109 1727204266.62286: variable 'interface' from source: set_fact 44109 1727204266.62326: variable 'interface' from source: set_fact TASK [Assert that the profile is absent - 'ethtest0'] ************************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml:5 Tuesday 24 September 2024 14:57:46 -0400 (0:00:00.031) 0:00:43.419 ***** 44109 1727204266.62347: entering _queue_task() for managed-node1/assert 44109 1727204266.62568: worker is 1 (out of 1 available) 44109 1727204266.62582: exiting _queue_task() for managed-node1/assert 44109 1727204266.62594: done queuing things up, now waiting for results queue to drain 44109 1727204266.62595: waiting for pending results... 
44109 1727204266.62769: running TaskExecutor() for managed-node1/TASK: Assert that the profile is absent - 'ethtest0' 44109 1727204266.62833: in run() - task 028d2410-947f-ed67-a560-00000000067c 44109 1727204266.62844: variable 'ansible_search_path' from source: unknown 44109 1727204266.62847: variable 'ansible_search_path' from source: unknown 44109 1727204266.62877: calling self._execute() 44109 1727204266.62952: variable 'ansible_host' from source: host vars for 'managed-node1' 44109 1727204266.62956: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44109 1727204266.62965: variable 'omit' from source: magic vars 44109 1727204266.63393: variable 'ansible_distribution_major_version' from source: facts 44109 1727204266.63397: Evaluated conditional (ansible_distribution_major_version != '6'): True 44109 1727204266.63400: variable 'omit' from source: magic vars 44109 1727204266.63402: variable 'omit' from source: magic vars 44109 1727204266.63515: variable 'profile' from source: include params 44109 1727204266.63526: variable 'interface' from source: set_fact 44109 1727204266.63591: variable 'interface' from source: set_fact 44109 1727204266.63624: variable 'omit' from source: magic vars 44109 1727204266.63668: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 44109 1727204266.63709: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 44109 1727204266.63742: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 44109 1727204266.63764: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44109 1727204266.63831: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44109 1727204266.63835: variable 'inventory_hostname' from source: host vars for 
'managed-node1' 44109 1727204266.63837: variable 'ansible_host' from source: host vars for 'managed-node1' 44109 1727204266.63839: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44109 1727204266.63948: Set connection var ansible_connection to ssh 44109 1727204266.63960: Set connection var ansible_timeout to 10 44109 1727204266.63970: Set connection var ansible_module_compression to ZIP_DEFLATED 44109 1727204266.63985: Set connection var ansible_pipelining to False 44109 1727204266.63995: Set connection var ansible_shell_executable to /bin/sh 44109 1727204266.64006: Set connection var ansible_shell_type to sh 44109 1727204266.64074: variable 'ansible_shell_executable' from source: unknown 44109 1727204266.64079: variable 'ansible_connection' from source: unknown 44109 1727204266.64082: variable 'ansible_module_compression' from source: unknown 44109 1727204266.64084: variable 'ansible_shell_type' from source: unknown 44109 1727204266.64086: variable 'ansible_shell_executable' from source: unknown 44109 1727204266.64088: variable 'ansible_host' from source: host vars for 'managed-node1' 44109 1727204266.64090: variable 'ansible_pipelining' from source: unknown 44109 1727204266.64092: variable 'ansible_timeout' from source: unknown 44109 1727204266.64094: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44109 1727204266.64187: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 44109 1727204266.64197: variable 'omit' from source: magic vars 44109 1727204266.64202: starting attempt loop 44109 1727204266.64205: running the handler 44109 1727204266.64291: variable 'lsr_net_profile_exists' from source: set_fact 44109 1727204266.64295: Evaluated conditional (not 
lsr_net_profile_exists): True 44109 1727204266.64301: handler run complete 44109 1727204266.64313: attempt loop complete, returning result 44109 1727204266.64318: _execute() done 44109 1727204266.64321: dumping result to json 44109 1727204266.64323: done dumping result, returning 44109 1727204266.64329: done running TaskExecutor() for managed-node1/TASK: Assert that the profile is absent - 'ethtest0' [028d2410-947f-ed67-a560-00000000067c] 44109 1727204266.64333: sending task result for task 028d2410-947f-ed67-a560-00000000067c ok: [managed-node1] => { "changed": false } MSG: All assertions passed 44109 1727204266.64457: no more pending results, returning what we have 44109 1727204266.64461: results queue empty 44109 1727204266.64462: checking for any_errors_fatal 44109 1727204266.64468: done checking for any_errors_fatal 44109 1727204266.64469: checking for max_fail_percentage 44109 1727204266.64470: done checking for max_fail_percentage 44109 1727204266.64471: checking to see if all hosts have failed and the running result is not ok 44109 1727204266.64472: done checking to see if all hosts have failed 44109 1727204266.64473: getting the remaining hosts for this loop 44109 1727204266.64474: done getting the remaining hosts for this loop 44109 1727204266.64483: getting the next task for host managed-node1 44109 1727204266.64491: done getting next task for host managed-node1 44109 1727204266.64495: ^ task is: TASK: Include the task 'assert_device_absent.yml' 44109 1727204266.64497: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44109 1727204266.64501: getting variables 44109 1727204266.64502: in VariableManager get_vars() 44109 1727204266.64530: Calling all_inventory to load vars for managed-node1 44109 1727204266.64532: Calling groups_inventory to load vars for managed-node1 44109 1727204266.64536: Calling all_plugins_inventory to load vars for managed-node1 44109 1727204266.64542: done sending task result for task 028d2410-947f-ed67-a560-00000000067c 44109 1727204266.64544: WORKER PROCESS EXITING 44109 1727204266.64553: Calling all_plugins_play to load vars for managed-node1 44109 1727204266.64556: Calling groups_plugins_inventory to load vars for managed-node1 44109 1727204266.64558: Calling groups_plugins_play to load vars for managed-node1 44109 1727204266.65451: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44109 1727204266.66916: done with get_vars() 44109 1727204266.66938: done getting variables

TASK [Include the task 'assert_device_absent.yml'] *****************************
task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_routing_rules.yml:234
Tuesday 24 September 2024  14:57:46 -0400 (0:00:00.046)       0:00:43.466 *****
44109 1727204266.67029: entering _queue_task() for managed-node1/include_tasks 44109 1727204266.67336: worker is 1 (out of 1 available) 44109 1727204266.67347: exiting _queue_task() for managed-node1/include_tasks 44109 1727204266.67357: done queuing things up, now waiting for results queue to drain 44109 1727204266.67358: waiting for pending results...
44109 1727204266.67646: running TaskExecutor() for managed-node1/TASK: Include the task 'assert_device_absent.yml' 44109 1727204266.67785: in run() - task 028d2410-947f-ed67-a560-0000000000aa 44109 1727204266.67789: variable 'ansible_search_path' from source: unknown 44109 1727204266.67792: calling self._execute() 44109 1727204266.67866: variable 'ansible_host' from source: host vars for 'managed-node1' 44109 1727204266.67872: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44109 1727204266.67884: variable 'omit' from source: magic vars 44109 1727204266.68254: variable 'ansible_distribution_major_version' from source: facts 44109 1727204266.68265: Evaluated conditional (ansible_distribution_major_version != '6'): True 44109 1727204266.68271: _execute() done 44109 1727204266.68275: dumping result to json 44109 1727204266.68279: done dumping result, returning 44109 1727204266.68480: done running TaskExecutor() for managed-node1/TASK: Include the task 'assert_device_absent.yml' [028d2410-947f-ed67-a560-0000000000aa] 44109 1727204266.68484: sending task result for task 028d2410-947f-ed67-a560-0000000000aa 44109 1727204266.68548: done sending task result for task 028d2410-947f-ed67-a560-0000000000aa 44109 1727204266.68551: WORKER PROCESS EXITING 44109 1727204266.68573: no more pending results, returning what we have 44109 1727204266.68579: in VariableManager get_vars() 44109 1727204266.68607: Calling all_inventory to load vars for managed-node1 44109 1727204266.68610: Calling groups_inventory to load vars for managed-node1 44109 1727204266.68615: Calling all_plugins_inventory to load vars for managed-node1 44109 1727204266.68625: Calling all_plugins_play to load vars for managed-node1 44109 1727204266.68628: Calling groups_plugins_inventory to load vars for managed-node1 44109 1727204266.68631: Calling groups_plugins_play to load vars for managed-node1 44109 1727204266.69959: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44109 1727204266.71695: done with get_vars() 44109 1727204266.71716: variable 'ansible_search_path' from source: unknown 44109 1727204266.71731: we have included files to process 44109 1727204266.71732: generating all_blocks data 44109 1727204266.71734: done generating all_blocks data 44109 1727204266.71740: processing included file: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml 44109 1727204266.71741: loading included file: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml 44109 1727204266.71744: Loading data from /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml 44109 1727204266.71904: in VariableManager get_vars() 44109 1727204266.71922: done with get_vars() 44109 1727204266.72029: done processing included file 44109 1727204266.72031: iterating over new_blocks loaded from include file 44109 1727204266.72032: in VariableManager get_vars() 44109 1727204266.72043: done with get_vars() 44109 1727204266.72044: filtering new block on tags 44109 1727204266.72060: done filtering new block on tags 44109 1727204266.72063: done iterating over new_blocks loaded from include file included: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml for managed-node1 44109 1727204266.72069: extending task lists for all hosts with included blocks 44109 1727204266.72245: done extending task lists 44109 1727204266.72247: done processing included files 44109 1727204266.72248: results queue empty 44109 1727204266.72249: checking for any_errors_fatal 44109 1727204266.72252: done checking for any_errors_fatal 44109 1727204266.72253: checking for max_fail_percentage 44109 1727204266.72254: done 
checking for max_fail_percentage 44109 1727204266.72255: checking to see if all hosts have failed and the running result is not ok 44109 1727204266.72255: done checking to see if all hosts have failed 44109 1727204266.72256: getting the remaining hosts for this loop 44109 1727204266.72257: done getting the remaining hosts for this loop 44109 1727204266.72260: getting the next task for host managed-node1 44109 1727204266.72263: done getting next task for host managed-node1 44109 1727204266.72266: ^ task is: TASK: Include the task 'get_interface_stat.yml' 44109 1727204266.72268: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44109 1727204266.72270: getting variables 44109 1727204266.72271: in VariableManager get_vars() 44109 1727204266.72281: Calling all_inventory to load vars for managed-node1 44109 1727204266.72283: Calling groups_inventory to load vars for managed-node1 44109 1727204266.72285: Calling all_plugins_inventory to load vars for managed-node1 44109 1727204266.72290: Calling all_plugins_play to load vars for managed-node1 44109 1727204266.72293: Calling groups_plugins_inventory to load vars for managed-node1 44109 1727204266.72295: Calling groups_plugins_play to load vars for managed-node1 44109 1727204266.73485: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44109 1727204266.75078: done with get_vars() 44109 1727204266.75098: done getting variables

TASK [Include the task 'get_interface_stat.yml'] *******************************
task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml:3
Tuesday 24 September 2024  14:57:46 -0400 (0:00:00.081)       0:00:43.548 *****
44109 1727204266.75178: entering _queue_task() for managed-node1/include_tasks 44109 1727204266.75524: worker is 1 (out of 1 available) 44109 1727204266.75537: exiting _queue_task() for managed-node1/include_tasks 44109 1727204266.75548: done queuing things up, now waiting for results queue to drain 44109 1727204266.75549: waiting for pending results...
44109 1727204266.75835: running TaskExecutor() for managed-node1/TASK: Include the task 'get_interface_stat.yml' 44109 1727204266.75934: in run() - task 028d2410-947f-ed67-a560-0000000006c4 44109 1727204266.75946: variable 'ansible_search_path' from source: unknown 44109 1727204266.75949: variable 'ansible_search_path' from source: unknown 44109 1727204266.75985: calling self._execute() 44109 1727204266.76079: variable 'ansible_host' from source: host vars for 'managed-node1' 44109 1727204266.76084: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44109 1727204266.76094: variable 'omit' from source: magic vars 44109 1727204266.76465: variable 'ansible_distribution_major_version' from source: facts 44109 1727204266.76478: Evaluated conditional (ansible_distribution_major_version != '6'): True 44109 1727204266.76484: _execute() done 44109 1727204266.76487: dumping result to json 44109 1727204266.76491: done dumping result, returning 44109 1727204266.76498: done running TaskExecutor() for managed-node1/TASK: Include the task 'get_interface_stat.yml' [028d2410-947f-ed67-a560-0000000006c4] 44109 1727204266.76503: sending task result for task 028d2410-947f-ed67-a560-0000000006c4 44109 1727204266.76596: done sending task result for task 028d2410-947f-ed67-a560-0000000006c4 44109 1727204266.76600: WORKER PROCESS EXITING 44109 1727204266.76631: no more pending results, returning what we have 44109 1727204266.76637: in VariableManager get_vars() 44109 1727204266.76670: Calling all_inventory to load vars for managed-node1 44109 1727204266.76673: Calling groups_inventory to load vars for managed-node1 44109 1727204266.76679: Calling all_plugins_inventory to load vars for managed-node1 44109 1727204266.76693: Calling all_plugins_play to load vars for managed-node1 44109 1727204266.76696: Calling groups_plugins_inventory to load vars for managed-node1 44109 1727204266.76699: Calling groups_plugins_play to load vars for managed-node1 44109 
1727204266.78333: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44109 1727204266.79915: done with get_vars() 44109 1727204266.79942: variable 'ansible_search_path' from source: unknown 44109 1727204266.79943: variable 'ansible_search_path' from source: unknown 44109 1727204266.79985: we have included files to process 44109 1727204266.79986: generating all_blocks data 44109 1727204266.79988: done generating all_blocks data 44109 1727204266.79989: processing included file: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 44109 1727204266.79990: loading included file: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 44109 1727204266.79992: Loading data from /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 44109 1727204266.80188: done processing included file 44109 1727204266.80190: iterating over new_blocks loaded from include file 44109 1727204266.80192: in VariableManager get_vars() 44109 1727204266.80205: done with get_vars() 44109 1727204266.80207: filtering new block on tags 44109 1727204266.80224: done filtering new block on tags 44109 1727204266.80226: done iterating over new_blocks loaded from include file included: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml for managed-node1 44109 1727204266.80231: extending task lists for all hosts with included blocks 44109 1727204266.80337: done extending task lists 44109 1727204266.80339: done processing included files 44109 1727204266.80340: results queue empty 44109 1727204266.80340: checking for any_errors_fatal 44109 1727204266.80344: done checking for any_errors_fatal 44109 1727204266.80344: checking for max_fail_percentage 44109 1727204266.80346: done checking for 
max_fail_percentage 44109 1727204266.80346: checking to see if all hosts have failed and the running result is not ok 44109 1727204266.80347: done checking to see if all hosts have failed 44109 1727204266.80348: getting the remaining hosts for this loop 44109 1727204266.80349: done getting the remaining hosts for this loop 44109 1727204266.80352: getting the next task for host managed-node1 44109 1727204266.80356: done getting next task for host managed-node1 44109 1727204266.80358: ^ task is: TASK: Get stat for interface {{ interface }} 44109 1727204266.80361: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44109 1727204266.80363: getting variables 44109 1727204266.80364: in VariableManager get_vars() 44109 1727204266.80373: Calling all_inventory to load vars for managed-node1 44109 1727204266.80378: Calling groups_inventory to load vars for managed-node1 44109 1727204266.80381: Calling all_plugins_inventory to load vars for managed-node1 44109 1727204266.80386: Calling all_plugins_play to load vars for managed-node1 44109 1727204266.80388: Calling groups_plugins_inventory to load vars for managed-node1 44109 1727204266.80391: Calling groups_plugins_play to load vars for managed-node1 44109 1727204266.81555: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44109 1727204266.83153: done with get_vars() 44109 1727204266.83179: done getting variables 44109 1727204266.83342: variable 'interface' from source: set_fact

TASK [Get stat for interface ethtest0] *****************************************
task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml:3
Tuesday 24 September 2024  14:57:46 -0400 (0:00:00.081)       0:00:43.630 *****
44109 1727204266.83373: entering _queue_task() for managed-node1/stat 44109 1727204266.83725: worker is 1 (out of 1 available) 44109 1727204266.83735: exiting _queue_task() for managed-node1/stat 44109 1727204266.83746: done queuing things up, now waiting for results queue to drain 44109 1727204266.83747: waiting for pending results...
44109 1727204266.84192: running TaskExecutor() for managed-node1/TASK: Get stat for interface ethtest0 44109 1727204266.84202: in run() - task 028d2410-947f-ed67-a560-0000000006de 44109 1727204266.84206: variable 'ansible_search_path' from source: unknown 44109 1727204266.84208: variable 'ansible_search_path' from source: unknown 44109 1727204266.84211: calling self._execute() 44109 1727204266.84286: variable 'ansible_host' from source: host vars for 'managed-node1' 44109 1727204266.84292: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44109 1727204266.84301: variable 'omit' from source: magic vars 44109 1727204266.84679: variable 'ansible_distribution_major_version' from source: facts 44109 1727204266.84688: Evaluated conditional (ansible_distribution_major_version != '6'): True 44109 1727204266.84695: variable 'omit' from source: magic vars 44109 1727204266.84738: variable 'omit' from source: magic vars 44109 1727204266.84829: variable 'interface' from source: set_fact 44109 1727204266.84844: variable 'omit' from source: magic vars 44109 1727204266.85182: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 44109 1727204266.85186: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 44109 1727204266.85189: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 44109 1727204266.85191: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44109 1727204266.85194: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44109 1727204266.85196: variable 'inventory_hostname' from source: host vars for 'managed-node1' 44109 1727204266.85199: variable 'ansible_host' from source: host vars for 'managed-node1' 44109 1727204266.85202: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44109 1727204266.85204: Set connection var ansible_connection to ssh 44109 1727204266.85206: Set connection var ansible_timeout to 10 44109 1727204266.85209: Set connection var ansible_module_compression to ZIP_DEFLATED 44109 1727204266.85211: Set connection var ansible_pipelining to False 44109 1727204266.85213: Set connection var ansible_shell_executable to /bin/sh 44109 1727204266.85215: Set connection var ansible_shell_type to sh 44109 1727204266.85218: variable 'ansible_shell_executable' from source: unknown 44109 1727204266.85220: variable 'ansible_connection' from source: unknown 44109 1727204266.85222: variable 'ansible_module_compression' from source: unknown 44109 1727204266.85224: variable 'ansible_shell_type' from source: unknown 44109 1727204266.85226: variable 'ansible_shell_executable' from source: unknown 44109 1727204266.85228: variable 'ansible_host' from source: host vars for 'managed-node1' 44109 1727204266.85230: variable 'ansible_pipelining' from source: unknown 44109 1727204266.85232: variable 'ansible_timeout' from source: unknown 44109 1727204266.85235: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44109 1727204266.85440: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 44109 1727204266.85452: variable 'omit' from source: magic vars 44109 1727204266.85456: starting attempt loop 44109 1727204266.85459: running the handler 44109 1727204266.85477: _low_level_execute_command(): starting 44109 1727204266.85485: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 44109 1727204266.86244: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44109 1727204266.86256: stderr chunk (state=3): >>>debug1: 
Reading configuration data /root/.ssh/config <<< 44109 1727204266.86273: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44109 1727204266.86293: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44109 1727204266.86307: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 <<< 44109 1727204266.86319: stderr chunk (state=3): >>>debug2: match not found <<< 44109 1727204266.86390: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44109 1727204266.86421: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 44109 1727204266.86433: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44109 1727204266.86452: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44109 1727204266.86562: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44109 1727204266.88354: stdout chunk (state=3): >>>/root <<< 44109 1727204266.88562: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44109 1727204266.88565: stdout chunk (state=3): >>><<< 44109 1727204266.88567: stderr chunk (state=3): >>><<< 44109 1727204266.88591: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration 
data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44109 1727204266.88702: _low_level_execute_command(): starting 44109 1727204266.88707: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204266.885995-47054-65229159477537 `" && echo ansible-tmp-1727204266.885995-47054-65229159477537="` echo /root/.ansible/tmp/ansible-tmp-1727204266.885995-47054-65229159477537 `" ) && sleep 0' 44109 1727204266.89443: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44109 1727204266.89473: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44109 1727204266.89517: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44109 1727204266.89595: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 
'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44109 1727204266.89674: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 44109 1727204266.89695: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44109 1727204266.89746: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44109 1727204266.89871: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44109 1727204266.91986: stdout chunk (state=3): >>>ansible-tmp-1727204266.885995-47054-65229159477537=/root/.ansible/tmp/ansible-tmp-1727204266.885995-47054-65229159477537 <<< 44109 1727204266.92282: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44109 1727204266.92286: stdout chunk (state=3): >>><<< 44109 1727204266.92289: stderr chunk (state=3): >>><<< 44109 1727204266.92291: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204266.885995-47054-65229159477537=/root/.ansible/tmp/ansible-tmp-1727204266.885995-47054-65229159477537 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match 
not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44109 1727204266.92293: variable 'ansible_module_compression' from source: unknown 44109 1727204266.92295: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-44109pzfqangk/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 44109 1727204266.92345: variable 'ansible_facts' from source: unknown 44109 1727204266.92439: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204266.885995-47054-65229159477537/AnsiballZ_stat.py 44109 1727204266.92650: Sending initial data 44109 1727204266.92653: Sent initial data (151 bytes) 44109 1727204266.93222: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44109 1727204266.93236: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44109 1727204266.93250: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44109 1727204266.93292: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44109 1727204266.93303: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44109 1727204266.93317: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 <<< 44109 1727204266.93400: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK <<< 44109 1727204266.93421: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44109 1727204266.93530: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44109 1727204266.95328: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 44109 1727204266.95425: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 44109 1727204266.95514: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-44109pzfqangk/tmptb306yin /root/.ansible/tmp/ansible-tmp-1727204266.885995-47054-65229159477537/AnsiballZ_stat.py <<< 44109 1727204266.95518: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204266.885995-47054-65229159477537/AnsiballZ_stat.py" <<< 44109 1727204266.95583: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-44109pzfqangk/tmptb306yin" to remote "/root/.ansible/tmp/ansible-tmp-1727204266.885995-47054-65229159477537/AnsiballZ_stat.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204266.885995-47054-65229159477537/AnsiballZ_stat.py" <<< 44109 1727204266.96638: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44109 1727204266.96642: stdout chunk (state=3): >>><<< 44109 1727204266.96644: stderr chunk (state=3): >>><<< 44109 1727204266.96646: done transferring module to remote 44109 1727204266.96648: _low_level_execute_command(): starting 44109 1727204266.96650: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204266.885995-47054-65229159477537/ /root/.ansible/tmp/ansible-tmp-1727204266.885995-47054-65229159477537/AnsiballZ_stat.py && sleep 0' 44109 1727204266.97293: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration 
debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44109 1727204266.97330: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 44109 1727204266.97347: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44109 1727204266.97366: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44109 1727204266.97476: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44109 1727204266.99480: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44109 1727204266.99492: stdout chunk (state=3): >>><<< 44109 1727204266.99503: stderr chunk (state=3): >>><<< 44109 1727204266.99527: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config 
debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44109 1727204266.99535: _low_level_execute_command(): starting 44109 1727204266.99544: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204266.885995-47054-65229159477537/AnsiballZ_stat.py && sleep 0' 44109 1727204267.00166: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44109 1727204267.00181: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44109 1727204267.00198: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44109 1727204267.00220: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44109 1727204267.00243: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 <<< 44109 1727204267.00255: stderr chunk (state=3): >>>debug2: match not found <<< 44109 1727204267.00343: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 44109 1727204267.00360: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44109 1727204267.00384: 
stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44109 1727204267.00581: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44109 1727204267.17053: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/ethtest0", "follow": false, "checksum_algorithm": "sha1"}}} <<< 44109 1727204267.18557: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.47 closed. <<< 44109 1727204267.18579: stderr chunk (state=3): >>><<< 44109 1727204267.18582: stdout chunk (state=3): >>><<< 44109 1727204267.18598: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/ethtest0", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at 
'/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.47 closed. 44109 1727204267.18622: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/sys/class/net/ethtest0', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204266.885995-47054-65229159477537/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 44109 1727204267.18630: _low_level_execute_command(): starting 44109 1727204267.18635: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204266.885995-47054-65229159477537/ > /dev/null 2>&1 && sleep 0' 44109 1727204267.19043: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44109 1727204267.19047: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 <<< 44109 1727204267.19074: stderr chunk (state=3): >>>debug2: match not found <<< 44109 1727204267.19080: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass 
<<< 44109 1727204267.19083: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44109 1727204267.19085: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44109 1727204267.19161: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 44109 1727204267.19232: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44109 1727204267.21187: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44109 1727204267.21211: stderr chunk (state=3): >>><<< 44109 1727204267.21217: stdout chunk (state=3): >>><<< 44109 1727204267.21228: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: 
match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44109 1727204267.21238: handler run complete 44109 1727204267.21254: attempt loop complete, returning result 44109 1727204267.21257: _execute() done 44109 1727204267.21260: dumping result to json 44109 1727204267.21262: done dumping result, returning 44109 1727204267.21271: done running TaskExecutor() for managed-node1/TASK: Get stat for interface ethtest0 [028d2410-947f-ed67-a560-0000000006de] 44109 1727204267.21273: sending task result for task 028d2410-947f-ed67-a560-0000000006de 44109 1727204267.21368: done sending task result for task 028d2410-947f-ed67-a560-0000000006de 44109 1727204267.21371: WORKER PROCESS EXITING ok: [managed-node1] => { "changed": false, "stat": { "exists": false } } 44109 1727204267.21430: no more pending results, returning what we have 44109 1727204267.21434: results queue empty 44109 1727204267.21435: checking for any_errors_fatal 44109 1727204267.21437: done checking for any_errors_fatal 44109 1727204267.21437: checking for max_fail_percentage 44109 1727204267.21439: done checking for max_fail_percentage 44109 1727204267.21440: checking to see if all hosts have failed and the running result is not ok 44109 1727204267.21440: done checking to see if all hosts have failed 44109 1727204267.21441: getting the remaining hosts for this loop 44109 1727204267.21442: done getting the remaining hosts for this loop 44109 1727204267.21446: getting the next task for host managed-node1 44109 1727204267.21453: done getting next task for host managed-node1 44109 1727204267.21455: ^ task is: TASK: Assert that the interface is absent - '{{ interface }}' 44109 1727204267.21458: ^ state is: HOST STATE: 
block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44109 1727204267.21462: getting variables 44109 1727204267.21464: in VariableManager get_vars() 44109 1727204267.21498: Calling all_inventory to load vars for managed-node1 44109 1727204267.21502: Calling groups_inventory to load vars for managed-node1 44109 1727204267.21506: Calling all_plugins_inventory to load vars for managed-node1 44109 1727204267.21519: Calling all_plugins_play to load vars for managed-node1 44109 1727204267.21522: Calling groups_plugins_inventory to load vars for managed-node1 44109 1727204267.21524: Calling groups_plugins_play to load vars for managed-node1 44109 1727204267.22407: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44109 1727204267.23291: done with get_vars() 44109 1727204267.23310: done getting variables 44109 1727204267.23356: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 44109 1727204267.23457: variable 'interface' from source: set_fact TASK [Assert that the interface is absent - 'ethtest0'] ************************ task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml:5 Tuesday 24 September 
2024 14:57:47 -0400 (0:00:00.401) 0:00:44.031 ***** 44109 1727204267.23482: entering _queue_task() for managed-node1/assert 44109 1727204267.23742: worker is 1 (out of 1 available) 44109 1727204267.23755: exiting _queue_task() for managed-node1/assert 44109 1727204267.23766: done queuing things up, now waiting for results queue to drain 44109 1727204267.23767: waiting for pending results... 44109 1727204267.23944: running TaskExecutor() for managed-node1/TASK: Assert that the interface is absent - 'ethtest0' 44109 1727204267.24019: in run() - task 028d2410-947f-ed67-a560-0000000006c5 44109 1727204267.24030: variable 'ansible_search_path' from source: unknown 44109 1727204267.24033: variable 'ansible_search_path' from source: unknown 44109 1727204267.24061: calling self._execute() 44109 1727204267.24140: variable 'ansible_host' from source: host vars for 'managed-node1' 44109 1727204267.24144: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44109 1727204267.24281: variable 'omit' from source: magic vars 44109 1727204267.24521: variable 'ansible_distribution_major_version' from source: facts 44109 1727204267.24538: Evaluated conditional (ansible_distribution_major_version != '6'): True 44109 1727204267.24550: variable 'omit' from source: magic vars 44109 1727204267.24597: variable 'omit' from source: magic vars 44109 1727204267.24695: variable 'interface' from source: set_fact 44109 1727204267.24719: variable 'omit' from source: magic vars 44109 1727204267.24764: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 44109 1727204267.24808: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 44109 1727204267.24833: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 44109 1727204267.24856: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, 
class_only=False) 44109 1727204267.24872: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44109 1727204267.24912: variable 'inventory_hostname' from source: host vars for 'managed-node1' 44109 1727204267.24921: variable 'ansible_host' from source: host vars for 'managed-node1' 44109 1727204267.24929: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44109 1727204267.25036: Set connection var ansible_connection to ssh 44109 1727204267.25048: Set connection var ansible_timeout to 10 44109 1727204267.25058: Set connection var ansible_module_compression to ZIP_DEFLATED 44109 1727204267.25069: Set connection var ansible_pipelining to False 44109 1727204267.25082: Set connection var ansible_shell_executable to /bin/sh 44109 1727204267.25091: Set connection var ansible_shell_type to sh 44109 1727204267.25117: variable 'ansible_shell_executable' from source: unknown 44109 1727204267.25128: variable 'ansible_connection' from source: unknown 44109 1727204267.25135: variable 'ansible_module_compression' from source: unknown 44109 1727204267.25141: variable 'ansible_shell_type' from source: unknown 44109 1727204267.25149: variable 'ansible_shell_executable' from source: unknown 44109 1727204267.25155: variable 'ansible_host' from source: host vars for 'managed-node1' 44109 1727204267.25162: variable 'ansible_pipelining' from source: unknown 44109 1727204267.25283: variable 'ansible_timeout' from source: unknown 44109 1727204267.25286: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44109 1727204267.25330: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 44109 1727204267.25348: 
variable 'omit' from source: magic vars 44109 1727204267.25358: starting attempt loop 44109 1727204267.25365: running the handler 44109 1727204267.25512: variable 'interface_stat' from source: set_fact 44109 1727204267.25527: Evaluated conditional (not interface_stat.stat.exists): True 44109 1727204267.25538: handler run complete 44109 1727204267.25558: attempt loop complete, returning result 44109 1727204267.25566: _execute() done 44109 1727204267.25577: dumping result to json 44109 1727204267.25585: done dumping result, returning 44109 1727204267.25683: done running TaskExecutor() for managed-node1/TASK: Assert that the interface is absent - 'ethtest0' [028d2410-947f-ed67-a560-0000000006c5] 44109 1727204267.25686: sending task result for task 028d2410-947f-ed67-a560-0000000006c5 44109 1727204267.25755: done sending task result for task 028d2410-947f-ed67-a560-0000000006c5 44109 1727204267.25759: WORKER PROCESS EXITING ok: [managed-node1] => { "changed": false } MSG: All assertions passed 44109 1727204267.25835: no more pending results, returning what we have 44109 1727204267.25839: results queue empty 44109 1727204267.25840: checking for any_errors_fatal 44109 1727204267.25851: done checking for any_errors_fatal 44109 1727204267.25852: checking for max_fail_percentage 44109 1727204267.25854: done checking for max_fail_percentage 44109 1727204267.25855: checking to see if all hosts have failed and the running result is not ok 44109 1727204267.25856: done checking to see if all hosts have failed 44109 1727204267.25857: getting the remaining hosts for this loop 44109 1727204267.25858: done getting the remaining hosts for this loop 44109 1727204267.25862: getting the next task for host managed-node1 44109 1727204267.25869: done getting next task for host managed-node1 44109 1727204267.25873: ^ task is: TASK: Verify network state restored to default 44109 1727204267.25877: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, 
fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44109 1727204267.25882: getting variables 44109 1727204267.25884: in VariableManager get_vars() 44109 1727204267.25914: Calling all_inventory to load vars for managed-node1 44109 1727204267.25918: Calling groups_inventory to load vars for managed-node1 44109 1727204267.25922: Calling all_plugins_inventory to load vars for managed-node1 44109 1727204267.25934: Calling all_plugins_play to load vars for managed-node1 44109 1727204267.25937: Calling groups_plugins_inventory to load vars for managed-node1 44109 1727204267.25941: Calling groups_plugins_play to load vars for managed-node1 44109 1727204267.28163: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44109 1727204267.30493: done with get_vars() 44109 1727204267.30522: done getting variables TASK [Verify network state restored to default] ******************************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_routing_rules.yml:236 Tuesday 24 September 2024 14:57:47 -0400 (0:00:00.071) 0:00:44.103 ***** 44109 1727204267.30677: entering _queue_task() for managed-node1/include_tasks 44109 1727204267.31096: worker is 1 (out of 1 available) 44109 1727204267.31109: exiting _queue_task() for managed-node1/include_tasks 44109 1727204267.31122: done queuing things up, now waiting for results queue to drain 44109 1727204267.31123: waiting for pending results... 
44109 1727204267.31541: running TaskExecutor() for managed-node1/TASK: Verify network state restored to default 44109 1727204267.31632: in run() - task 028d2410-947f-ed67-a560-0000000000ab 44109 1727204267.31655: variable 'ansible_search_path' from source: unknown 44109 1727204267.31718: calling self._execute() 44109 1727204267.31853: variable 'ansible_host' from source: host vars for 'managed-node1' 44109 1727204267.31866: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44109 1727204267.31888: variable 'omit' from source: magic vars 44109 1727204267.32426: variable 'ansible_distribution_major_version' from source: facts 44109 1727204267.32444: Evaluated conditional (ansible_distribution_major_version != '6'): True 44109 1727204267.32455: _execute() done 44109 1727204267.32463: dumping result to json 44109 1727204267.32478: done dumping result, returning 44109 1727204267.32499: done running TaskExecutor() for managed-node1/TASK: Verify network state restored to default [028d2410-947f-ed67-a560-0000000000ab] 44109 1727204267.32503: sending task result for task 028d2410-947f-ed67-a560-0000000000ab 44109 1727204267.32883: done sending task result for task 028d2410-947f-ed67-a560-0000000000ab 44109 1727204267.32886: WORKER PROCESS EXITING 44109 1727204267.32916: no more pending results, returning what we have 44109 1727204267.32922: in VariableManager get_vars() 44109 1727204267.32957: Calling all_inventory to load vars for managed-node1 44109 1727204267.32960: Calling groups_inventory to load vars for managed-node1 44109 1727204267.32964: Calling all_plugins_inventory to load vars for managed-node1 44109 1727204267.32979: Calling all_plugins_play to load vars for managed-node1 44109 1727204267.32982: Calling groups_plugins_inventory to load vars for managed-node1 44109 1727204267.32986: Calling groups_plugins_play to load vars for managed-node1 44109 1727204267.34863: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44109 1727204267.36789: done with get_vars() 44109 1727204267.36822: variable 'ansible_search_path' from source: unknown 44109 1727204267.36839: we have included files to process 44109 1727204267.36841: generating all_blocks data 44109 1727204267.36843: done generating all_blocks data 44109 1727204267.36847: processing included file: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml 44109 1727204267.36849: loading included file: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml 44109 1727204267.36856: Loading data from /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml 44109 1727204267.37301: done processing included file 44109 1727204267.37303: iterating over new_blocks loaded from include file 44109 1727204267.37305: in VariableManager get_vars() 44109 1727204267.37319: done with get_vars() 44109 1727204267.37322: filtering new block on tags 44109 1727204267.37339: done filtering new block on tags 44109 1727204267.37342: done iterating over new_blocks loaded from include file included: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml for managed-node1 44109 1727204267.37348: extending task lists for all hosts with included blocks 44109 1727204267.37665: done extending task lists 44109 1727204267.37667: done processing included files 44109 1727204267.37667: results queue empty 44109 1727204267.37668: checking for any_errors_fatal 44109 1727204267.37678: done checking for any_errors_fatal 44109 1727204267.37679: checking for max_fail_percentage 44109 1727204267.37680: done checking for max_fail_percentage 44109 1727204267.37681: checking to see if all hosts have failed and the running 
result is not ok 44109 1727204267.37682: done checking to see if all hosts have failed 44109 1727204267.37683: getting the remaining hosts for this loop 44109 1727204267.37684: done getting the remaining hosts for this loop 44109 1727204267.37687: getting the next task for host managed-node1 44109 1727204267.37691: done getting next task for host managed-node1 44109 1727204267.37694: ^ task is: TASK: Check routes and DNS 44109 1727204267.37696: ^ state is: HOST STATE: block=2, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44109 1727204267.37698: getting variables 44109 1727204267.37699: in VariableManager get_vars() 44109 1727204267.37709: Calling all_inventory to load vars for managed-node1 44109 1727204267.37711: Calling groups_inventory to load vars for managed-node1 44109 1727204267.37717: Calling all_plugins_inventory to load vars for managed-node1 44109 1727204267.37836: Calling all_plugins_play to load vars for managed-node1 44109 1727204267.37840: Calling groups_plugins_inventory to load vars for managed-node1 44109 1727204267.37844: Calling groups_plugins_play to load vars for managed-node1 44109 1727204267.42586: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44109 1727204267.46529: done with get_vars() 44109 1727204267.46566: done getting variables 44109 1727204267.46817: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Check routes and DNS] **************************************************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml:6 Tuesday 24 September 2024 14:57:47 -0400 (0:00:00.161) 0:00:44.264 ***** 44109 1727204267.46849: entering _queue_task() for managed-node1/shell 44109 1727204267.47633: worker is 1 (out of 1 available) 44109 1727204267.47645: exiting _queue_task() for managed-node1/shell 44109 1727204267.47657: done queuing things up, now waiting for results queue to drain 44109 1727204267.47658: waiting for pending results... 
44109 1727204267.47998: running TaskExecutor() for managed-node1/TASK: Check routes and DNS 44109 1727204267.48194: in run() - task 028d2410-947f-ed67-a560-0000000006f6 44109 1727204267.48219: variable 'ansible_search_path' from source: unknown 44109 1727204267.48229: variable 'ansible_search_path' from source: unknown 44109 1727204267.48266: calling self._execute() 44109 1727204267.48781: variable 'ansible_host' from source: host vars for 'managed-node1' 44109 1727204267.48785: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44109 1727204267.48787: variable 'omit' from source: magic vars 44109 1727204267.49136: variable 'ansible_distribution_major_version' from source: facts 44109 1727204267.49580: Evaluated conditional (ansible_distribution_major_version != '6'): True 44109 1727204267.49584: variable 'omit' from source: magic vars 44109 1727204267.49587: variable 'omit' from source: magic vars 44109 1727204267.49589: variable 'omit' from source: magic vars 44109 1727204267.49591: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 44109 1727204267.49593: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 44109 1727204267.49595: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 44109 1727204267.49701: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44109 1727204267.49724: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44109 1727204267.49915: variable 'inventory_hostname' from source: host vars for 'managed-node1' 44109 1727204267.49927: variable 'ansible_host' from source: host vars for 'managed-node1' 44109 1727204267.49935: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44109 1727204267.50046: 
Set connection var ansible_connection to ssh 44109 1727204267.50480: Set connection var ansible_timeout to 10 44109 1727204267.50484: Set connection var ansible_module_compression to ZIP_DEFLATED 44109 1727204267.50486: Set connection var ansible_pipelining to False 44109 1727204267.50488: Set connection var ansible_shell_executable to /bin/sh 44109 1727204267.50490: Set connection var ansible_shell_type to sh 44109 1727204267.50492: variable 'ansible_shell_executable' from source: unknown 44109 1727204267.50494: variable 'ansible_connection' from source: unknown 44109 1727204267.50496: variable 'ansible_module_compression' from source: unknown 44109 1727204267.50498: variable 'ansible_shell_type' from source: unknown 44109 1727204267.50500: variable 'ansible_shell_executable' from source: unknown 44109 1727204267.50502: variable 'ansible_host' from source: host vars for 'managed-node1' 44109 1727204267.50503: variable 'ansible_pipelining' from source: unknown 44109 1727204267.50505: variable 'ansible_timeout' from source: unknown 44109 1727204267.50507: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44109 1727204267.50535: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 44109 1727204267.50974: variable 'omit' from source: magic vars 44109 1727204267.50979: starting attempt loop 44109 1727204267.50981: running the handler 44109 1727204267.50984: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 44109 1727204267.50987: 
_low_level_execute_command(): starting 44109 1727204267.50989: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 44109 1727204267.52272: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 44109 1727204267.52494: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44109 1727204267.52510: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44109 1727204267.52787: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44109 1727204267.54591: stdout chunk (state=3): >>>/root <<< 44109 1727204267.54725: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44109 1727204267.54738: stdout chunk (state=3): >>><<< 44109 1727204267.54755: stderr chunk (state=3): >>><<< 44109 1727204267.54783: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44109 1727204267.54815: _low_level_execute_command(): starting 44109 1727204267.54990: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204267.5479763-47117-80978328982997 `" && echo ansible-tmp-1727204267.5479763-47117-80978328982997="` echo /root/.ansible/tmp/ansible-tmp-1727204267.5479763-47117-80978328982997 `" ) && sleep 0' 44109 1727204267.56282: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 44109 1727204267.56295: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44109 1727204267.56321: stderr chunk 
(state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44109 1727204267.56366: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 44109 1727204267.56597: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44109 1727204267.56616: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44109 1727204267.56724: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44109 1727204267.58832: stdout chunk (state=3): >>>ansible-tmp-1727204267.5479763-47117-80978328982997=/root/.ansible/tmp/ansible-tmp-1727204267.5479763-47117-80978328982997 <<< 44109 1727204267.58946: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44109 1727204267.58980: stderr chunk (state=3): >>><<< 44109 1727204267.58989: stdout chunk (state=3): >>><<< 44109 1727204267.59017: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204267.5479763-47117-80978328982997=/root/.ansible/tmp/ansible-tmp-1727204267.5479763-47117-80978328982997 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass 
debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44109 1727204267.59062: variable 'ansible_module_compression' from source: unknown 44109 1727204267.59137: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-44109pzfqangk/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 44109 1727204267.59306: variable 'ansible_facts' from source: unknown 44109 1727204267.59412: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204267.5479763-47117-80978328982997/AnsiballZ_command.py 44109 1727204267.59920: Sending initial data 44109 1727204267.59930: Sent initial data (155 bytes) 44109 1727204267.60968: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44109 1727204267.61181: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK <<< 44109 1727204267.61394: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44109 1727204267.61598: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44109 1727204267.63356: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 44109 1727204267.63442: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 44109 1727204267.63923: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-44109pzfqangk/tmpeuv9p4fw /root/.ansible/tmp/ansible-tmp-1727204267.5479763-47117-80978328982997/AnsiballZ_command.py <<< 44109 1727204267.63927: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204267.5479763-47117-80978328982997/AnsiballZ_command.py" debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-44109pzfqangk/tmpeuv9p4fw" to remote "/root/.ansible/tmp/ansible-tmp-1727204267.5479763-47117-80978328982997/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204267.5479763-47117-80978328982997/AnsiballZ_command.py" <<< 44109 1727204267.65698: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44109 1727204267.65982: stderr chunk (state=3): >>><<< 44109 1727204267.65985: stdout chunk (state=3): >>><<< 44109 1727204267.65987: done transferring module to remote 44109 1727204267.65989: _low_level_execute_command(): starting 44109 1727204267.66089: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204267.5479763-47117-80978328982997/ /root/.ansible/tmp/ansible-tmp-1727204267.5479763-47117-80978328982997/AnsiballZ_command.py && sleep 0' 44109 1727204267.67264: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44109 1727204267.67282: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44109 1727204267.67291: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44109 1727204267.67308: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44109 1727204267.67377: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44109 1727204267.67519: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK <<< 44109 1727204267.67545: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44109 1727204267.68017: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44109 1727204267.70000: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44109 1727204267.70068: stderr chunk (state=3): >>><<< 44109 1727204267.70078: stdout chunk (state=3): >>><<< 44109 1727204267.70230: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44109 1727204267.70237: _low_level_execute_command(): starting 44109 1727204267.70242: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204267.5479763-47117-80978328982997/AnsiballZ_command.py && sleep 0' 44109 1727204267.71958: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44109 1727204267.71962: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44109 1727204267.71965: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44109 1727204267.71968: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 44109 1727204267.71970: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44109 1727204267.71972: stderr chunk (state=3): 
>>>debug2: mux_client_hello_exchange: master version 4 <<< 44109 1727204267.72180: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44109 1727204267.89373: stdout chunk (state=3): >>> {"changed": true, "stdout": "IP\n1: lo: mtu 65536 qdisc noqueue state UNKNOWN group default qlen 1000\n link/loopback 00:00:00:00:00:00 brd 00:00:00:00:00:00\n inet 127.0.0.1/8 scope host lo\n valid_lft forever preferred_lft forever\n inet6 ::1/128 scope host noprefixroute \n valid_lft forever preferred_lft forever\n2: eth0: mtu 9001 qdisc mq state UP group default qlen 1000\n link/ether 0a:ff:dd:89:9b:e5 brd ff:ff:ff:ff:ff:ff\n altname enX0\n inet 10.31.14.47/22 brd 10.31.15.255 scope global dynamic noprefixroute eth0\n valid_lft 2776sec preferred_lft 2776sec\n inet6 fe80::8ff:ddff:fe89:9be5/64 scope link noprefixroute \n valid_lft forever preferred_lft forever\nIP ROUTE\ndefault via 10.31.12.1 dev eth0 proto dhcp src 10.31.14.47 metric 100 \n10.31.12.0/22 dev eth0 proto kernel scope link src 10.31.14.47 metric 100 \nIP -6 ROUTE\nfe80::/64 dev eth0 proto kernel metric 1024 pref medium\nRESOLV\n# Generated by NetworkManager\nsearch us-east-1.aws.redhat.com\nnameserver 10.29.169.13\nnameserver 10.29.170.12\nnameserver 10.2.32.1", "stderr": "", "rc": 0, "cmd": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "start": "2024-09-24 14:57:47.882407", "end": "2024-09-24 14:57:47.891656", "delta": "0:00:00.009249", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, 
"strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 44109 1727204267.91126: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.47 closed. <<< 44109 1727204267.91228: stderr chunk (state=3): >>><<< 44109 1727204267.91238: stdout chunk (state=3): >>><<< 44109 1727204267.91262: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "IP\n1: lo: mtu 65536 qdisc noqueue state UNKNOWN group default qlen 1000\n link/loopback 00:00:00:00:00:00 brd 00:00:00:00:00:00\n inet 127.0.0.1/8 scope host lo\n valid_lft forever preferred_lft forever\n inet6 ::1/128 scope host noprefixroute \n valid_lft forever preferred_lft forever\n2: eth0: mtu 9001 qdisc mq state UP group default qlen 1000\n link/ether 0a:ff:dd:89:9b:e5 brd ff:ff:ff:ff:ff:ff\n altname enX0\n inet 10.31.14.47/22 brd 10.31.15.255 scope global dynamic noprefixroute eth0\n valid_lft 2776sec preferred_lft 2776sec\n inet6 fe80::8ff:ddff:fe89:9be5/64 scope link noprefixroute \n valid_lft forever preferred_lft forever\nIP ROUTE\ndefault via 10.31.12.1 dev eth0 proto dhcp src 10.31.14.47 metric 100 \n10.31.12.0/22 dev eth0 proto kernel scope link src 10.31.14.47 metric 100 \nIP -6 ROUTE\nfe80::/64 dev eth0 proto kernel metric 1024 pref medium\nRESOLV\n# Generated by NetworkManager\nsearch us-east-1.aws.redhat.com\nnameserver 10.29.169.13\nnameserver 10.29.170.12\nnameserver 10.2.32.1", "stderr": "", "rc": 0, "cmd": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "start": "2024-09-24 14:57:47.882407", "end": "2024-09-24 14:57:47.891656", "delta": "0:00:00.009249", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip 
-6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.47 closed. 
44109 1727204267.91376: done with _execute_module (ansible.legacy.command, {'_raw_params': 'set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204267.5479763-47117-80978328982997/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 44109 1727204267.91581: _low_level_execute_command(): starting 44109 1727204267.91584: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204267.5479763-47117-80978328982997/ > /dev/null 2>&1 && sleep 0' 44109 1727204267.92894: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44109 1727204267.92932: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 44109 1727204267.92986: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44109 1727204267.92989: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44109 1727204267.93100: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44109 1727204267.95309: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44109 1727204267.95316: stdout chunk (state=3): >>><<< 44109 1727204267.95319: stderr chunk (state=3): >>><<< 44109 1727204267.95545: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: 
mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44109 1727204267.95550: handler run complete 44109 1727204267.95552: Evaluated conditional (False): False 44109 1727204267.95554: attempt loop complete, returning result 44109 1727204267.95556: _execute() done 44109 1727204267.95557: dumping result to json 44109 1727204267.95559: done dumping result, returning 44109 1727204267.95561: done running TaskExecutor() for managed-node1/TASK: Check routes and DNS [028d2410-947f-ed67-a560-0000000006f6] 44109 1727204267.95563: sending task result for task 028d2410-947f-ed67-a560-0000000006f6 44109 1727204267.95632: done sending task result for task 028d2410-947f-ed67-a560-0000000006f6 44109 1727204267.95635: WORKER PROCESS EXITING ok: [managed-node1] => { "changed": false, "cmd": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "delta": "0:00:00.009249", "end": "2024-09-24 14:57:47.891656", "rc": 0, "start": "2024-09-24 14:57:47.882407" } STDOUT: IP 1: lo: mtu 65536 qdisc noqueue state UNKNOWN group default qlen 1000 link/loopback 00:00:00:00:00:00 brd 00:00:00:00:00:00 inet 127.0.0.1/8 scope host lo valid_lft forever preferred_lft forever inet6 ::1/128 scope host noprefixroute valid_lft forever preferred_lft forever 2: eth0: mtu 9001 qdisc mq state UP group default qlen 1000 link/ether 0a:ff:dd:89:9b:e5 brd ff:ff:ff:ff:ff:ff altname enX0 inet 10.31.14.47/22 brd 10.31.15.255 scope global dynamic noprefixroute eth0 valid_lft 2776sec preferred_lft 2776sec inet6 fe80::8ff:ddff:fe89:9be5/64 scope link noprefixroute valid_lft forever preferred_lft forever IP ROUTE default via 10.31.12.1 dev eth0 proto dhcp src 10.31.14.47 metric 100 10.31.12.0/22 dev eth0 proto kernel scope link src 10.31.14.47 metric 100 IP 
-6 ROUTE fe80::/64 dev eth0 proto kernel metric 1024 pref medium RESOLV # Generated by NetworkManager search us-east-1.aws.redhat.com nameserver 10.29.169.13 nameserver 10.29.170.12 nameserver 10.2.32.1 44109 1727204267.95703: no more pending results, returning what we have 44109 1727204267.95708: results queue empty 44109 1727204267.95709: checking for any_errors_fatal 44109 1727204267.95711: done checking for any_errors_fatal 44109 1727204267.95711: checking for max_fail_percentage 44109 1727204267.95716: done checking for max_fail_percentage 44109 1727204267.95717: checking to see if all hosts have failed and the running result is not ok 44109 1727204267.95718: done checking to see if all hosts have failed 44109 1727204267.95719: getting the remaining hosts for this loop 44109 1727204267.95721: done getting the remaining hosts for this loop 44109 1727204267.95725: getting the next task for host managed-node1 44109 1727204267.95731: done getting next task for host managed-node1 44109 1727204267.95734: ^ task is: TASK: Verify DNS and network connectivity 44109 1727204267.95737: ^ state is: HOST STATE: block=2, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44109 1727204267.95742: getting variables 44109 1727204267.95743: in VariableManager get_vars() 44109 1727204267.95773: Calling all_inventory to load vars for managed-node1 44109 1727204267.95777: Calling groups_inventory to load vars for managed-node1 44109 1727204267.95781: Calling all_plugins_inventory to load vars for managed-node1 44109 1727204267.95796: Calling all_plugins_play to load vars for managed-node1 44109 1727204267.95799: Calling groups_plugins_inventory to load vars for managed-node1 44109 1727204267.95801: Calling groups_plugins_play to load vars for managed-node1 44109 1727204267.98742: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44109 1727204268.02555: done with get_vars() 44109 1727204268.02591: done getting variables 44109 1727204268.02938: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Verify DNS and network connectivity] ************************************* task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml:24 Tuesday 24 September 2024 14:57:48 -0400 (0:00:00.561) 0:00:44.825 ***** 44109 1727204268.02969: entering _queue_task() for managed-node1/shell 44109 1727204268.03925: worker is 1 (out of 1 available) 44109 1727204268.03937: exiting _queue_task() for managed-node1/shell 44109 1727204268.03946: done queuing things up, now waiting for results queue to drain 44109 1727204268.03947: waiting for pending results... 
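The "Check routes and DNS" task above returns its script in the JSON-escaped `cmd` field of the module result. A few lines of Python can decode such a result into the readable multi-line script; this is a sketch using an abbreviated copy of the result (the full `cmd` value, which also runs `ip route`, `ip -6 route`, and falls back to `ls -alrtF /etc/resolv.*`, appears verbatim in the log above):

```python
import json

# Abbreviated stand-in for the module result printed in the log; only the
# first few commands of the real "cmd" string are reproduced here.
result_json = r'''{"rc": 0, "cmd": "set -euo pipefail\necho IP\nip a\necho RESOLV\ncat /etc/resolv.conf\n"}'''

# json.loads turns the escaped \n sequences back into real newlines,
# recovering the shell script exactly as the command module executed it.
result = json.loads(result_json)
print(result["cmd"])
```

The same decoding applies to any `stdout`/`stderr`/`cmd` field in these result blobs, which is often easier than reading the escaped form inline.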
44109 1727204268.04794: running TaskExecutor() for managed-node1/TASK: Verify DNS and network connectivity 44109 1727204268.04799: in run() - task 028d2410-947f-ed67-a560-0000000006f7 44109 1727204268.04802: variable 'ansible_search_path' from source: unknown 44109 1727204268.04805: variable 'ansible_search_path' from source: unknown 44109 1727204268.05001: calling self._execute() 44109 1727204268.05005: variable 'ansible_host' from source: host vars for 'managed-node1' 44109 1727204268.05008: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44109 1727204268.05011: variable 'omit' from source: magic vars 44109 1727204268.05874: variable 'ansible_distribution_major_version' from source: facts 44109 1727204268.05880: Evaluated conditional (ansible_distribution_major_version != '6'): True 44109 1727204268.05997: variable 'ansible_facts' from source: unknown 44109 1727204268.07452: Evaluated conditional (ansible_facts["distribution"] == "CentOS"): True 44109 1727204268.07456: variable 'omit' from source: magic vars 44109 1727204268.07681: variable 'omit' from source: magic vars 44109 1727204268.07684: variable 'omit' from source: magic vars 44109 1727204268.07687: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 44109 1727204268.07716: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 44109 1727204268.07735: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 44109 1727204268.07753: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44109 1727204268.07765: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44109 1727204268.08099: variable 'inventory_hostname' from source: host vars for 'managed-node1' 44109 1727204268.08103: variable 
'ansible_host' from source: host vars for 'managed-node1' 44109 1727204268.08106: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44109 1727204268.08204: Set connection var ansible_connection to ssh 44109 1727204268.08210: Set connection var ansible_timeout to 10 44109 1727204268.08217: Set connection var ansible_module_compression to ZIP_DEFLATED 44109 1727204268.08224: Set connection var ansible_pipelining to False 44109 1727204268.08229: Set connection var ansible_shell_executable to /bin/sh 44109 1727204268.08234: Set connection var ansible_shell_type to sh 44109 1727204268.08257: variable 'ansible_shell_executable' from source: unknown 44109 1727204268.08260: variable 'ansible_connection' from source: unknown 44109 1727204268.08263: variable 'ansible_module_compression' from source: unknown 44109 1727204268.08265: variable 'ansible_shell_type' from source: unknown 44109 1727204268.08268: variable 'ansible_shell_executable' from source: unknown 44109 1727204268.08270: variable 'ansible_host' from source: host vars for 'managed-node1' 44109 1727204268.08277: variable 'ansible_pipelining' from source: unknown 44109 1727204268.08484: variable 'ansible_timeout' from source: unknown 44109 1727204268.08488: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44109 1727204268.08626: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 44109 1727204268.08688: variable 'omit' from source: magic vars 44109 1727204268.08691: starting attempt loop 44109 1727204268.08694: running the handler 44109 1727204268.08697: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 44109 1727204268.08699: _low_level_execute_command(): starting 44109 1727204268.08701: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 44109 1727204268.10296: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 44109 1727204268.10319: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44109 1727204268.10333: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44109 1727204268.10444: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44109 1727204268.12386: stdout chunk (state=3): >>>/root <<< 44109 1727204268.12459: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44109 1727204268.12466: stdout chunk (state=3): >>><<< 44109 1727204268.12474: stderr chunk (state=3): >>><<< 44109 1727204268.12498: 
_low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44109 1727204268.12515: _low_level_execute_command(): starting 44109 1727204268.12519: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204268.1249845-47190-26584793539758 `" && echo ansible-tmp-1727204268.1249845-47190-26584793539758="` echo /root/.ansible/tmp/ansible-tmp-1727204268.1249845-47190-26584793539758 `" ) && sleep 0' 44109 1727204268.13763: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 
originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 44109 1727204268.14088: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 44109 1727204268.14095: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44109 1727204268.16207: stdout chunk (state=3): >>>ansible-tmp-1727204268.1249845-47190-26584793539758=/root/.ansible/tmp/ansible-tmp-1727204268.1249845-47190-26584793539758 <<< 44109 1727204268.16315: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44109 1727204268.16360: stderr chunk (state=3): >>><<< 44109 1727204268.16367: stdout chunk (state=3): >>><<< 44109 1727204268.16397: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204268.1249845-47190-26584793539758=/root/.ansible/tmp/ansible-tmp-1727204268.1249845-47190-26584793539758 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match 
pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44109 1727204268.16429: variable 'ansible_module_compression' from source: unknown 44109 1727204268.16482: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-44109pzfqangk/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 44109 1727204268.16520: variable 'ansible_facts' from source: unknown 44109 1727204268.16802: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204268.1249845-47190-26584793539758/AnsiballZ_command.py 44109 1727204268.17128: Sending initial data 44109 1727204268.17131: Sent initial data (155 bytes) 44109 1727204268.18201: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44109 1727204268.18535: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading 
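The remote temp-directory command above, `( umask 77 && mkdir -p ... && mkdir ... )`, is Ansible's standard pattern: a private (0700) base directory plus a uniquely named per-task subdirectory. A minimal local sketch of the same pattern, with a locally generated name standing in for Ansible's timestamped one:

```shell
#!/bin/sh
# Sketch of Ansible's remote tmp-dir pattern: umask 77 makes the
# directories 0700, mkdir -p creates the base, and a second mkdir
# creates a uniquely named per-task subdirectory.
# TMP_BASE is a local stand-in for ~/.ansible/tmp.
TMP_BASE="${TMPDIR:-/tmp}/ansible-demo-tmp"
TASK_DIR="$TMP_BASE/ansible-tmp-$(date +%s)-$$"
( umask 77 && mkdir -p "$TMP_BASE" && mkdir "$TASK_DIR" ) || exit 1
echo "$TASK_DIR"
```

The subshell keeps the `umask` change from leaking into the caller's environment, which is why the real command wraps everything in `( ... )`.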
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 44109 1727204268.18539: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44109 1727204268.18542: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44109 1727204268.18616: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44109 1727204268.20353: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 44109 1727204268.20431: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 44109 1727204268.20673: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-44109pzfqangk/tmptlwux75l /root/.ansible/tmp/ansible-tmp-1727204268.1249845-47190-26584793539758/AnsiballZ_command.py <<< 44109 1727204268.20680: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204268.1249845-47190-26584793539758/AnsiballZ_command.py" <<< 44109 1727204268.20752: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-44109pzfqangk/tmptlwux75l" to remote "/root/.ansible/tmp/ansible-tmp-1727204268.1249845-47190-26584793539758/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204268.1249845-47190-26584793539758/AnsiballZ_command.py" <<< 44109 1727204268.22690: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44109 1727204268.22694: stdout chunk (state=3): >>><<< 44109 1727204268.22700: stderr chunk (state=3): >>><<< 44109 1727204268.22733: done transferring module to remote 44109 1727204268.22743: _low_level_execute_command(): starting 44109 1727204268.22748: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204268.1249845-47190-26584793539758/ /root/.ansible/tmp/ansible-tmp-1727204268.1249845-47190-26584793539758/AnsiballZ_command.py && sleep 0' 44109 1727204268.23800: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44109 1727204268.24083: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 44109 1727204268.24105: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44109 1727204268.24123: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44109 1727204268.24380: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44109 1727204268.26362: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44109 1727204268.26504: stdout chunk (state=3): >>><<< 44109 1727204268.26507: stderr chunk (state=3): >>><<< 44109 1727204268.26585: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44109 1727204268.26595: _low_level_execute_command(): starting 44109 1727204268.26598: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204268.1249845-47190-26584793539758/AnsiballZ_command.py && sleep 0' 44109 1727204268.27704: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44109 1727204268.27792: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44109 1727204268.27914: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK <<< 44109 1727204268.28081: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44109 1727204268.28296: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44109 1727204268.52989: stdout 
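The three `_low_level_execute_command()` calls around this point trace the full module lifecycle: make the transferred payload executable (`chmod u+x` on the directory and `AnsiballZ_command.py`), run it with the remote Python, then (later in the log) remove the per-task directory. A local sketch of that lifecycle, using a stand-in shell script instead of the AnsiballZ payload:

```shell
#!/bin/sh
# Sketch of the transfer/execute/cleanup lifecycle seen in this log,
# with a trivial local script standing in for AnsiballZ_command.py.
WORKDIR="${TMPDIR:-/tmp}/ansible-demo-$$"
mkdir -p "$WORKDIR"
printf '%s\n' '#!/bin/sh' 'echo module-ran' > "$WORKDIR/demo_module.sh"
# 1) make the dir and payload executable (cf. 'chmod u+x ... && sleep 0')
chmod u+x "$WORKDIR" "$WORKDIR/demo_module.sh"
# 2) execute the payload (cf. '/usr/bin/python3.12 .../AnsiballZ_command.py')
OUT=$("$WORKDIR/demo_module.sh")
# 3) remove the per-task directory (cf. 'rm -f -r ... > /dev/null 2>&1')
rm -rf "$WORKDIR"
echo "$OUT"
```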
chunk (state=3): >>> {"changed": true, "stdout": "CHECK DNS AND CONNECTIVITY\n2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org", "stderr": " % Total % Received % Xferd Average Speed Time Time Time Current\n Dload Upload Total Spent Left Speed\n\r 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0\r100 305 100 305 0 0 14684 0 --:--:-- --:--:-- --:--:-- 15250\n % Total % Received % Xferd Average Speed Time Time Time Current\n Dload Upload Total Spent Left Speed\n\r 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0\r100 291 100 291 0 0 7850 0 --:--:-- --:--:-- --:--:-- 8083", "rc": 0, "cmd": "set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! getent hosts \"$host\"; then\n echo FAILED to lookup host \"$host\"\n exit 1\n fi\n if ! 
curl -o /dev/null https://\"$host\"; then\n echo FAILED to contact host \"$host\"\n exit 1\n fi\ndone\n", "start": "2024-09-24 14:57:48.444501", "end": "2024-09-24 14:57:48.527219", "delta": "0:00:00.082718", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! getent hosts \"$host\"; then\n echo FAILED to lookup host \"$host\"\n exit 1\n fi\n if ! curl -o /dev/null https://\"$host\"; then\n echo FAILED to contact host \"$host\"\n exit 1\n fi\ndone\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 44109 1727204268.54847: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44109 1727204268.54928: stderr chunk (state=3): >>>Shared connection to 10.31.14.47 closed. <<< 44109 1727204268.54932: stdout chunk (state=3): >>><<< 44109 1727204268.54934: stderr chunk (state=3): >>><<< 44109 1727204268.54958: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "CHECK DNS AND CONNECTIVITY\n2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.centos.org 
mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org", "stderr": " % Total % Received % Xferd Average Speed Time Time Time Current\n Dload Upload Total Spent Left Speed\n\r 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0\r100 305 100 305 0 0 14684 0 --:--:-- --:--:-- --:--:-- 15250\n % Total % Received % Xferd Average Speed Time Time Time Current\n Dload Upload Total Spent Left Speed\n\r 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0\r100 291 100 291 0 0 7850 0 --:--:-- --:--:-- --:--:-- 8083", "rc": 0, "cmd": "set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! getent hosts \"$host\"; then\n echo FAILED to lookup host \"$host\"\n exit 1\n fi\n if ! curl -o /dev/null https://\"$host\"; then\n echo FAILED to contact host \"$host\"\n exit 1\n fi\ndone\n", "start": "2024-09-24 14:57:48.444501", "end": "2024-09-24 14:57:48.527219", "delta": "0:00:00.082718", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! getent hosts \"$host\"; then\n echo FAILED to lookup host \"$host\"\n exit 1\n fi\n if ! 
curl -o /dev/null https://\"$host\"; then\n echo FAILED to contact host \"$host\"\n exit 1\n fi\ndone\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.47 closed. 44109 1727204268.55015: done with _execute_module (ansible.legacy.command, {'_raw_params': 'set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! getent hosts "$host"; then\n echo FAILED to lookup host "$host"\n exit 1\n fi\n if ! 
curl -o /dev/null https://"$host"; then\n echo FAILED to contact host "$host"\n exit 1\n fi\ndone\n', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204268.1249845-47190-26584793539758/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 44109 1727204268.55120: _low_level_execute_command(): starting 44109 1727204268.55181: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204268.1249845-47190-26584793539758/ > /dev/null 2>&1 && sleep 0' 44109 1727204268.56683: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 44109 1727204268.56687: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44109 1727204268.56801: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
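The payload's first guard, `getent hosts "$host"`, is what produces the `FAILED to lookup host` branch. That branch can be exercised offline: the `check_host` wrapper below is an added convenience for illustration (not part of the original payload), and `.invalid` is a reserved TLD that is guaranteed never to resolve.

```shell
#!/bin/sh
# check_host mirrors the DNS-lookup guard from the task payload above.
# It is a wrapper added here for illustration, not in the original.
check_host() {
  if ! getent hosts "$1" > /dev/null; then
    echo "FAILED to lookup host $1"
    return 1
  fi
}
# .invalid is an RFC 2606 reserved TLD, so this lookup always fails.
MSG=$(check_host nonexistent.invalid)
echo "$MSG"
```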
10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK <<< 44109 1727204268.56919: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44109 1727204268.57128: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44109 1727204268.59116: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44109 1727204268.59120: stdout chunk (state=3): >>><<< 44109 1727204268.59123: stderr chunk (state=3): >>><<< 44109 1727204268.59196: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44109 1727204268.59203: handler run complete 44109 1727204268.59233: Evaluated 
conditional (False): False 44109 1727204268.59243: attempt loop complete, returning result 44109 1727204268.59246: _execute() done 44109 1727204268.59248: dumping result to json 44109 1727204268.59255: done dumping result, returning 44109 1727204268.59263: done running TaskExecutor() for managed-node1/TASK: Verify DNS and network connectivity [028d2410-947f-ed67-a560-0000000006f7] 44109 1727204268.59266: sending task result for task 028d2410-947f-ed67-a560-0000000006f7 ok: [managed-node1] => { "changed": false, "cmd": "set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! getent hosts \"$host\"; then\n echo FAILED to lookup host \"$host\"\n exit 1\n fi\n if ! curl -o /dev/null https://\"$host\"; then\n echo FAILED to contact host \"$host\"\n exit 1\n fi\ndone\n", "delta": "0:00:00.082718", "end": "2024-09-24 14:57:48.527219", "rc": 0, "start": "2024-09-24 14:57:48.444501" } STDOUT: CHECK DNS AND CONNECTIVITY 2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.fedoraproject.org 2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.fedoraproject.org 2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.fedoraproject.org 2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.fedoraproject.org 2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.fedoraproject.org 2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.fedoraproject.org 2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org 2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org 2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org 2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org 2620:52:3:1:dead:beef:cafe:fed6 
wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org 2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org STDERR: % Total % Received % Xferd Average Speed Time Time Time Current Dload Upload Total Spent Left Speed 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0 100 305 100 305 0 0 14684 0 --:--:-- --:--:-- --:--:-- 15250 % Total % Received % Xferd Average Speed Time Time Time Current Dload Upload Total Spent Left Speed 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0 100 291 100 291 0 0 7850 0 --:--:-- --:--:-- --:--:-- 8083 44109 1727204268.59466: no more pending results, returning what we have 44109 1727204268.59471: results queue empty 44109 1727204268.59472: checking for any_errors_fatal 44109 1727204268.59484: done checking for any_errors_fatal 44109 1727204268.59485: checking for max_fail_percentage 44109 1727204268.59487: done checking for max_fail_percentage 44109 1727204268.59488: checking to see if all hosts have failed and the running result is not ok 44109 1727204268.59489: done checking to see if all hosts have failed 44109 1727204268.59490: getting the remaining hosts for this loop 44109 1727204268.59491: done getting the remaining hosts for this loop 44109 1727204268.59495: getting the next task for host managed-node1 44109 1727204268.59503: done getting next task for host managed-node1 44109 1727204268.59507: ^ task is: TASK: meta (flush_handlers) 44109 1727204268.59509: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
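A task like the following would produce the invocation and result recorded above. This is a hedged reconstruction, not the actual playbook (which is not shown): the name and script body are taken verbatim from the log, while `changed_when: false` is inferred from the module reporting `"changed": true` but the final result showing `"changed": false`.

```yaml
# Hypothetical reconstruction of the task that produced this result.
# Name and script are copied from the log; changed_when is inferred.
- name: Verify DNS and network connectivity
  ansible.builtin.shell: |
    set -euo pipefail
    echo CHECK DNS AND CONNECTIVITY
    for host in mirrors.fedoraproject.org mirrors.centos.org; do
      if ! getent hosts "$host"; then
        echo FAILED to lookup host "$host"
        exit 1
      fi
      if ! curl -o /dev/null https://"$host"; then
        echo FAILED to contact host "$host"
        exit 1
      fi
    done
  changed_when: false
```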
False 44109 1727204268.59521: getting variables 44109 1727204268.59523: in VariableManager get_vars() 44109 1727204268.59552: Calling all_inventory to load vars for managed-node1 44109 1727204268.59555: Calling groups_inventory to load vars for managed-node1 44109 1727204268.59559: Calling all_plugins_inventory to load vars for managed-node1 44109 1727204268.59570: Calling all_plugins_play to load vars for managed-node1 44109 1727204268.59574: Calling groups_plugins_inventory to load vars for managed-node1 44109 1727204268.59579: Calling groups_plugins_play to load vars for managed-node1 44109 1727204268.60390: done sending task result for task 028d2410-947f-ed67-a560-0000000006f7 44109 1727204268.60393: WORKER PROCESS EXITING 44109 1727204268.62398: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44109 1727204268.66498: done with get_vars() 44109 1727204268.66536: done getting variables 44109 1727204268.66610: in VariableManager get_vars() 44109 1727204268.66624: Calling all_inventory to load vars for managed-node1 44109 1727204268.66627: Calling groups_inventory to load vars for managed-node1 44109 1727204268.66629: Calling all_plugins_inventory to load vars for managed-node1 44109 1727204268.66634: Calling all_plugins_play to load vars for managed-node1 44109 1727204268.66637: Calling groups_plugins_inventory to load vars for managed-node1 44109 1727204268.66639: Calling groups_plugins_play to load vars for managed-node1 44109 1727204268.70718: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44109 1727204268.74264: done with get_vars() 44109 1727204268.74304: done queuing things up, now waiting for results queue to drain 44109 1727204268.74307: results queue empty 44109 1727204268.74308: checking for any_errors_fatal 44109 1727204268.74315: done checking for any_errors_fatal 44109 1727204268.74316: checking for max_fail_percentage 44109 
1727204268.74317: done checking for max_fail_percentage 44109 1727204268.74318: checking to see if all hosts have failed and the running result is not ok 44109 1727204268.74318: done checking to see if all hosts have failed 44109 1727204268.74319: getting the remaining hosts for this loop 44109 1727204268.74320: done getting the remaining hosts for this loop 44109 1727204268.74323: getting the next task for host managed-node1 44109 1727204268.74327: done getting next task for host managed-node1 44109 1727204268.74328: ^ task is: TASK: meta (flush_handlers) 44109 1727204268.74330: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44109 1727204268.74332: getting variables 44109 1727204268.74333: in VariableManager get_vars() 44109 1727204268.74343: Calling all_inventory to load vars for managed-node1 44109 1727204268.74345: Calling groups_inventory to load vars for managed-node1 44109 1727204268.74347: Calling all_plugins_inventory to load vars for managed-node1 44109 1727204268.74353: Calling all_plugins_play to load vars for managed-node1 44109 1727204268.74355: Calling groups_plugins_inventory to load vars for managed-node1 44109 1727204268.74357: Calling groups_plugins_play to load vars for managed-node1 44109 1727204268.87896: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44109 1727204268.91345: done with get_vars() 44109 1727204268.91372: done getting variables 44109 1727204268.91630: in VariableManager get_vars() 44109 1727204268.91641: Calling all_inventory to load vars for managed-node1 44109 1727204268.91643: Calling groups_inventory to load vars for managed-node1 44109 1727204268.91645: Calling all_plugins_inventory to load vars for managed-node1 44109 
1727204268.91649: Calling all_plugins_play to load vars for managed-node1 44109 1727204268.91651: Calling groups_plugins_inventory to load vars for managed-node1 44109 1727204268.91654: Calling groups_plugins_play to load vars for managed-node1 44109 1727204268.94025: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44109 1727204268.97033: done with get_vars() 44109 1727204268.97060: done queuing things up, now waiting for results queue to drain 44109 1727204268.97062: results queue empty 44109 1727204268.97063: checking for any_errors_fatal 44109 1727204268.97064: done checking for any_errors_fatal 44109 1727204268.97065: checking for max_fail_percentage 44109 1727204268.97066: done checking for max_fail_percentage 44109 1727204268.97067: checking to see if all hosts have failed and the running result is not ok 44109 1727204268.97068: done checking to see if all hosts have failed 44109 1727204268.97068: getting the remaining hosts for this loop 44109 1727204268.97069: done getting the remaining hosts for this loop 44109 1727204268.97072: getting the next task for host managed-node1 44109 1727204268.97077: done getting next task for host managed-node1 44109 1727204268.97078: ^ task is: None 44109 1727204268.97079: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44109 1727204268.97081: done queuing things up, now waiting for results queue to drain 44109 1727204268.97081: results queue empty 44109 1727204268.97082: checking for any_errors_fatal 44109 1727204268.97083: done checking for any_errors_fatal 44109 1727204268.97083: checking for max_fail_percentage 44109 1727204268.97084: done checking for max_fail_percentage 44109 1727204268.97085: checking to see if all hosts have failed and the running result is not ok 44109 1727204268.97086: done checking to see if all hosts have failed 44109 1727204268.97087: getting the next task for host managed-node1 44109 1727204268.97089: done getting next task for host managed-node1 44109 1727204268.97090: ^ task is: None 44109 1727204268.97091: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False PLAY RECAP ********************************************************************* managed-node1 : ok=87 changed=5 unreachable=0 failed=0 skipped=73 rescued=0 ignored=1 Tuesday 24 September 2024 14:57:48 -0400 (0:00:00.941) 0:00:45.767 ***** =============================================================================== fedora.linux_system_roles.network : Check which services are running ---- 2.25s /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 fedora.linux_system_roles.network : Check which services are running ---- 2.05s /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 fedora.linux_system_roles.network : Check which services are running ---- 2.03s /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 Gathering Facts --------------------------------------------------------- 2.03s /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/tests_routing_rules_nm.yml:6 Create veth interface ethtest0 ------------------------------------------ 1.40s /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:27 fedora.linux_system_roles.network : Configure networking connection profiles --- 1.25s /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Install iproute --------------------------------------------------------- 1.20s /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:16 Gathering Facts --------------------------------------------------------- 1.19s /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_routing_rules.yml:3 fedora.linux_system_roles.network : Enable and start NetworkManager ----- 1.13s 
/tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 Gathering Facts --------------------------------------------------------- 1.10s /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/down_profile+delete_interface.yml:5 Gathering Facts --------------------------------------------------------- 1.04s /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_routing_rules.yml:227 fedora.linux_system_roles.network : Check which packages are installed --- 1.01s /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 fedora.linux_system_roles.network : Check which packages are installed --- 1.00s /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 Gathering Facts --------------------------------------------------------- 0.96s /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/down_profile.yml:3 Verify DNS and network connectivity ------------------------------------- 0.94s /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml:24 Gathering Facts --------------------------------------------------------- 0.93s /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/remove_profile.yml:3 Gather the minimum subset of ansible_facts required by the network role test --- 0.88s /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:3 fedora.linux_system_roles.network : Check which packages are installed --- 0.88s /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 Check if system is ostree ----------------------------------------------- 0.78s 
/tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:17 fedora.linux_system_roles.network : Enable and start NetworkManager ----- 0.76s /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 44109 1727204268.97187: RUNNING CLEANUP